About the project

The course:

The course Introduction to open data science started on Wednesday 30th October 2019.

My feelings:

I’m feeling so excited!!

I became aware of the course a few weeks ago through an announcement on the University of Eastern Finland’s Yammer platform. After checking the course material on the MOOC platform, I see great new things coming up for me to learn, so I’m very excited about it.
I’m looking forward to all the new things I will learn in this course, and especially to the challenges it will bring.
I hope I can manage all the assignments.

I’m expecting to learn a lot about data handling, improve my R skills and hopefully be able to use this knowledge later on in my work.

Chapter 2: Regression and model validation

Data wrangling and regression analysis

Work of week 45 (4.11. - 10.11.2019)


1. Data wrangling

4.11.2019: Started to work through the DataCamp exercises offered by the IODS course and was able to finish

  • R Short and Sweet
  • R Helsinki Open Data Science courses

5.11.2019: Started to work on the R script for data wrangling and regression analysis

  • Wrote the script for data wrangling and was able to finish that part (all tests on the script worked)

Now follows a description of my work progress.

1.1. Read the dataframe

Script of the data read (reading the dataframe from the website)
# Read the data file ----
learning2014 <- read.table(file = 
                "https://www.mv.helsinki.fi/home/kvehkala/JYTmooc/JYTOPKYS3-data.txt", 
                sep = "\t", header = TRUE)                          
                # read the data file and assign it to the object "learning2014"

# Explore the read data ----

head(learning2014) # see the top of the dataset with the first 6 observations
str(learning2014) # check the structure of the dataset (variable types and first values)
dim(learning2014) # check the table dimensions --> (observations = rows and variables = columns)

The dataset consists of 183 observations (rows) and 60 variables (columns).

If you also want to inspect the data with some diagrams, you can use, for example, the following plots:

# Plot some of the data as a scatterplot or a histogram (here the variables Attitude and Points)
library(ggplot2)
qplot(Attitude, Points, data = learning2014)
hist(learning2014$Points)

What follows now is the data wrangling. We have to combine the variables of the deep, surface and strategic learning approaches and calculate the mean of each combined set. The other variables (gender, Attitude, Points) stay; only the column names are changed.

1.2. Perform the data wrangling

Script for data wrangling

I decided to do the data wrangling into a new R object (a data frame), which I named “lrn14_analysis”. This data frame will later be used for the data analyses.
I used a rather step-by-step way to do the data wrangling here; in the meeting on Wednesday, 6.11., a way using the pipe operator %>% was presented.

# Create an analysis dataset: "lrn14_analysis" ----

# Create an analysis dataset with the variables gender, age, attitude, deep, stra, surf and points 
# (by combining questions in the learning2014 data)

library(dplyr) # load the package for data wrangling

keep_columns <- c("gender","Age","Attitude","Points") # these are the data columns which need to be kept
lrn14_analysis <- select(learning2014, one_of(keep_columns))  # assign a new object and select the kept columns
colnames(lrn14_analysis) <- c("gender", "age", "attitude", "points") # change of the kept column names

# define the question groups (variables to be combined) according to the instructions
deep_q <- c("D03", "D11", "D19", "D27", "D07", "D14", "D22", "D30","D06",  "D15", "D23", "D31")  # deep questions
surf_q <- c("SU02","SU10","SU18","SU26", "SU05","SU13","SU21","SU29","SU08","SU16","SU24","SU32") # surface questions
stra_q <- c("ST01","ST09","ST17","ST25","ST04","ST12","ST20","ST28") # strategic questions

# Select the combined variables (columns) & scale the observations (mean) and add it to the analysis dataframe
deep <- select(learning2014, one_of(deep_q))
lrn14_analysis$deep <- round(rowMeans(deep, na.rm = TRUE), digits = 2) # values are rounded to 2 digits

surf <- select(learning2014, one_of(surf_q))
lrn14_analysis$surf <- round(rowMeans(surf,na.rm = TRUE), digits = 2)

stra <- select(learning2014, one_of(stra_q))
lrn14_analysis$stra <- round(rowMeans(stra, na.rm = TRUE), digits = 2)

# divide the values of the attitude column by 10 (it's a sum of 10 questions)
lrn14_analysis$attitude <- lrn14_analysis$attitude / 10

# Exclude observations where the exam points variable is zero. 
lrn14_analysis <- filter(lrn14_analysis, points > 0)
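
For comparison, the selection, renaming, scaling and filtering steps above could also be chained with the pipe operator %>% mentioned earlier. This is only a sketch of that alternative (the rowMeans() columns would still be added as above), assuming dplyr is loaded and learning2014 exists:

```r
library(dplyr) # provides select(), rename(), mutate(), filter() and %>%

# Pipe-based alternative for the selection, renaming, scaling and filtering steps
lrn14_piped <- learning2014 %>%
  select(one_of(c("gender", "Age", "Attitude", "Points"))) %>%  # keep the needed columns
  rename(age = Age, attitude = Attitude, points = Points) %>%   # lower-case column names
  mutate(attitude = attitude / 10) %>%                          # scale attitude back to 1-5
  filter(points > 0)                                            # drop zero-point exams
```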

The new dataframe was created. Now the structure is checked.

# Check the analysis dataset
str(lrn14_analysis)
## 'data.frame':    166 obs. of  7 variables:
##  $ gender  : Factor w/ 2 levels "F","M": 1 2 1 2 2 1 2 1 2 1 ...
##  $ age     : int  53 55 49 53 49 38 50 37 37 42 ...
##  $ attitude: num  3.7 3.1 2.5 3.5 3.7 3.8 3.5 2.9 3.8 2.1 ...
##  $ points  : int  25 12 24 10 22 21 21 31 24 26 ...
##  $ deep    : num  3.58 2.92 3.5 3.5 3.67 4.75 3.83 3.25 4.33 4 ...
##  $ surf    : num  2.58 3.17 2.25 2.25 2.83 2.42 1.92 2.83 2.17 3 ...
##  $ stra    : num  3.38 2.75 3.62 3.12 3.62 3.62 2.25 4 4.25 3.5 ...
dim(lrn14_analysis)
## [1] 166   7

The data consists now of 7 variables (columns) and 166 observations. The object name is “lrn14_analysis”.


1.3. Save the updated data frame as a .txt or .csv table

First the working directory is set, and then I save the data frame with the write.table() and write.csv() functions.

# Set the working directory to IODS project folder ---- 
setwd("~/IODS-project") # set the wd to the IODS folder
# save the analysis dataset to the "data" folder ----

write.table(lrn14_analysis, 
      file = "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.txt",
      sep = "\t", col.names = TRUE, row.names = TRUE)

write.csv(lrn14_analysis, 
      file = "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.csv",
      row.names = FALSE)

Afterwards I check whether the tables can be read by R, using the read.table() and read.csv() functions and piping the result with the %>% operator to the head() function, which shows the first 6 observations of the data frames.

# check if the table can be read ----
read.table(file = 
             "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.txt") %>% head()
##   gender age attitude points deep surf stra
## 1      F  53      3.7     25 3.58 2.58 3.38
## 2      M  55      3.1     12 2.92 3.17 2.75
## 3      F  49      2.5     24 3.50 2.25 3.62
## 4      M  53      3.5     10 3.50 2.25 3.12
## 5      M  49      3.7     22 3.67 2.83 3.62
## 6      F  38      3.8     21 4.75 2.42 3.62
read.csv(file = 
           "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.csv") %>% head()
##   gender age attitude points deep surf stra
## 1      F  53      3.7     25 3.58 2.58 3.38
## 2      M  55      3.1     12 2.92 3.17 2.75
## 3      F  49      2.5     24 3.50 2.25 3.62
## 4      M  53      3.5     10 3.50 2.25 3.12
## 5      M  49      3.7     22 3.67 2.83 3.62
## 6      F  38      3.8     21 4.75 2.42 3.62

2. Analysis

The work on the analysis script and documentation started on 6.11.2019.

2.1. Reading the data

I read the dataset table from my data folder and checked the structure and dimensions of the data frame.

# Set the working directory
setwd("~/IODS-project/data") # set work directory
# Read the data file ----
lrn14_analysis <- 
read.table(file = "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.txt", stringsAsFactors = TRUE) 

lrn14_analysis %>% str() # read the data table and check the structure
## 'data.frame':    166 obs. of  7 variables:
##  $ gender  : Factor w/ 2 levels "F","M": 1 2 1 2 2 1 2 1 2 1 ...
##  $ age     : int  53 55 49 53 49 38 50 37 37 42 ...
##  $ attitude: num  3.7 3.1 2.5 3.5 3.7 3.8 3.5 2.9 3.8 2.1 ...
##  $ points  : int  25 12 24 10 22 21 21 31 24 26 ...
##  $ deep    : num  3.58 2.92 3.5 3.5 3.67 4.75 3.83 3.25 4.33 4 ...
##  $ surf    : num  2.58 3.17 2.25 2.25 2.83 2.42 1.92 2.83 2.17 3 ...
##  $ stra    : num  3.38 2.75 3.62 3.12 3.62 3.62 2.25 4 4.25 3.5 ...

When reading the data table I set the stringsAsFactors argument (in the read.table() function) to TRUE, so the observations “F” and “M” in the gender column become factor levels - here 1 & 2. The data frame includes 166 observations (rows) in 7 variables (columns).

2.2. Graphical overview and data summary

The data we analyse comes from a survey of students about their learning approaches. It includes the students’ gender, age, exam points and global attitude towards statistics (a sum of 10 questions on the students’ attitude towards statistics, each measured on a Likert scale (1-5)). The attitude value was divided by 10 in the data wrangling part to bring it back to the 1-5 scale.

For a graphical overview of the dataset I used the ggpairs() function, which produces several plots and the pairwise correlations between the different variables.

library(ggplot2) 
library(GGally) # to show the graph these packages need to be loaded

ov_lrn14_2 <- ggpairs(lrn14_analysis, mapping = aes(col = gender), title = "Graphical overview of lrn14_analysis", 
                      lower = list(combo = wrap("facethist", bins = 20)), 
                      upper = list(continuous = wrap("cor", size = 2.8)))

ov_lrn14_2 # show the graph

The overview plot shows the distribution of all observations of each variable with histograms, scatterplots and density curves. The colours represent the students’ gender: female students are shown in a reddish colour and male students in a turquoise colour.
The upper right part of the overview graph shows the correlations between the variables (all observations of one variable correlated with the observations of another variable). These results give a first hint as to which variables might yield a significant regression result.

# Save the overview plot

ggsave("OV_plot_lrn14.png", 
       plot = ov_lrn14_2, path = "~/IODS-project/data/", scale = 1, dpi = 300) 
# the graph is saved as .png file in my data folder

Additionally a data summary table of the lrn14_analysis dataset was created.

# Summary table of the lrn14_analysis data ----

Sum_table <- summary(lrn14_analysis)
Sum_table
##  gender       age           attitude         points           deep     
##  F:110   Min.   :17.00   Min.   :1.400   Min.   : 7.00   Min.   :1.58  
##  M: 56   1st Qu.:21.00   1st Qu.:2.600   1st Qu.:19.00   1st Qu.:3.33  
##          Median :22.00   Median :3.200   Median :23.00   Median :3.67  
##          Mean   :25.51   Mean   :3.143   Mean   :22.72   Mean   :3.68  
##          3rd Qu.:27.00   3rd Qu.:3.700   3rd Qu.:27.75   3rd Qu.:4.08  
##          Max.   :55.00   Max.   :5.000   Max.   :33.00   Max.   :4.92  
##       surf            stra      
##  Min.   :1.580   Min.   :1.250  
##  1st Qu.:2.420   1st Qu.:2.620  
##  Median :2.830   Median :3.185  
##  Mean   :2.787   Mean   :3.121  
##  3rd Qu.:3.170   3rd Qu.:3.620  
##  Max.   :4.330   Max.   :5.000

2.3. Regression models

2.3.1. Simple regression model of point ~ attitude

I tested different simple regression models with the exam points as the dependent variable (y), using the function lm().

# Simple regressions ----

# regression with gender variable
gen_lm <- lm(points ~ gender, data = lrn14_analysis)
summary(gen_lm) # not significant

# regression with age variable
age_lm <- lm(points ~ age, data = lrn14_analysis)
summary(age_lm) # not significant

These regression analyses did not yield significant outcomes. Next I performed the regression analysis of points ~ attitude, with the following outcome:

# attitude
att_lm <- lm(points ~ attitude, data = lrn14_analysis)
att_lm_res <- summary(att_lm)
att_lm_res
## 
## Call:
## lm(formula = points ~ attitude, data = lrn14_analysis)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -16.9763  -3.2119   0.4339   4.1534  10.6645 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  11.6372     1.8303   6.358 1.95e-09 ***
## attitude      3.5255     0.5674   6.214 4.12e-09 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.32 on 164 degrees of freedom
## Multiple R-squared:  0.1906, Adjusted R-squared:  0.1856 
## F-statistic: 38.61 on 1 and 164 DF,  p-value: 4.119e-09
knitr::kable(att_lm_res$coefficients, digits=3, caption="Regression coefficients point ~ attitude")
Regression coefficients point ~ attitude

              Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)     11.637       1.830    6.358         0
attitude         3.525       0.567    6.214         0

Here the diagnostic plots of the regression model points ~ attitude:

plot(att_lm, which = c(1,2,5))

Regression model point ~ attitude interpretation:

The simple regression analysis of points ~ attitude showed a significant outcome.

The summary of the regression model shows that the estimated increase in points per one unit of attitude is 3.525. The p-value of 4.12e-09 *** marks this as a significant result (it’s pretty much 0). So the students’ attitude (towards statistics) has a significant influence on the exam points outcome: depending on the attitude, a student gets a higher or lower number of points. The intercept shows a value of 11.63, so even with an attitude of zero a student would still reach a level of about 11 points on the exam.
The multiple R-squared is the coefficient of determination; in a simple regression it is the square of the correlation coefficient between the two variables. It tells us how much of the variation in the dependent variable the model explains, from a value of 1 (strongest relationship) down to 0 (no relationship). Here the value is 0.1906, so not a very strong relationship, but there is an effect.
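
To make the link between the correlation coefficient and R-squared concrete, one can check it directly in R. A quick sketch, assuming the lrn14_analysis data frame from above:

```r
# In a simple regression, R-squared equals the squared Pearson correlation
r <- cor(lrn14_analysis$attitude, lrn14_analysis$points) # correlation coefficient
r^2  # should match the Multiple R-squared of about 0.1906 reported above
```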

The diagnostic plots include the Residuals vs. Fitted plot, which is used to check the linearity (or non-linearity) of the model, possible non-constant variance (residuals fanning out in a certain direction) and possible outliers in the dataset.
The plots of our points ~ attitude model show good linearity and constant variance. The residuals are nicely distributed around the baseline and there is no direction in which the points show a change in variance. There are three labelled points (145, 56, 35), but they do not disturb the analysis and don’t have to be removed, since they are still within the overall distribution of the data points.
The Normal Q-Q plot helps us to check the distribution of the residuals (normal or not). Our data points follow the line nicely, so the residuals are normally distributed. That means we do not need to transform the data before further statistical analysis (for example with a logarithm or square root transformation).
The Residuals vs. Leverage plot helps us to identify observations that might have a high impact on the statistical model. The observations in our case show very low leverage, and even the points that lie further away from the others have low leverage and therefore cannot be considered influential outliers.

Additionally, here is a plot of points vs. attitude with the regression line. For me it is easier to understand the whole thing when the data is shown in a graph.

library(ggplot2)
qplot(attitude, points, data = lrn14_analysis) + geom_smooth(method = "lm")

In the graph one can check whether the estimated increase of 3.525 points per one unit of attitude is reflected by the regression line.


2.3.2. Multiple regression model of point ~ attitude + stra

Here I’m checking the linear regression of points on attitude and the strategic learning approach - I want to know whether the strategic learning approach has a relationship with the exam points of the students.

# points ~ attitude + stra ----

att_st_lm <- lm(points ~ attitude + stra, data = lrn14_analysis)
att_st_lm_res <- summary(att_st_lm)
att_st_lm_res
## 
## Call:
## lm(formula = points ~ attitude + stra, data = lrn14_analysis)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -17.6482  -3.3135   0.5571   3.7966  10.9300 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   8.9725     2.3966   3.744 0.000251 ***
## attitude      3.4664     0.5652   6.134 6.27e-09 ***
## stra          0.9132     0.5345   1.709 0.089438 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.289 on 163 degrees of freedom
## Multiple R-squared:  0.2048, Adjusted R-squared:  0.195 
## F-statistic: 20.99 on 2 and 163 DF,  p-value: 7.746e-09
knitr::kable(att_st_lm_res$coefficients, digits=3, caption="Regression coefficients point ~ attitude + stra")
Regression coefficients point ~ attitude + stra

              Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)      8.972       2.397    3.744     0.000
attitude         3.466       0.565    6.134     0.000
stra             0.913       0.534    1.709     0.089

Here the diagnostic plots of the regression model points ~ attitude + stra:

plot(att_st_lm, which = c(1,2,5))

Regression model point ~ attitude + stra interpretation:

In this multiple regression I wanted to see whether the strategic learning approach influences the number of exam points a student can reach. The outcome shows that the strategic learning approach has some influence, but not a very strong one; the p-value of 0.089 shows that the effect is not significant at the 0.05 level. The multiple R-squared is a bit higher compared to the points ~ attitude regression model, so we explain a little more of the variation if we take the strategic learning approach into account.
The diagnostic plots show more or less the same results as in the regression model points ~ attitude. The residuals are normally distributed and no outliers significantly disturb the analysis.


2.3.3. Multiple regression model of point ~ deep + surf + stra

In the third regression model I regress points on the deep, surface and strategic learning approaches. Does the combination of deep, surface and strategic learning approaches have an influence on the exam points of students?

# points ~ deep + surf + stra ----

de_su_st_lm <- lm(points ~ deep + surf + stra, data = lrn14_analysis)
de_su_st_lm_res <- summary(de_su_st_lm)

de_su_st_lm_res
## 
## Call:
## lm(formula = points ~ deep + surf + stra, data = lrn14_analysis)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -15.1208  -3.0725   0.5196   4.2798  10.3346 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  26.9426     5.1147   5.268 4.34e-07 ***
## deep         -0.7472     0.8659  -0.863   0.3895    
## surf         -1.6328     0.9149  -1.785   0.0762 .  
## stra          0.9850     0.5962   1.652   0.1005    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.827 on 162 degrees of freedom
## Multiple R-squared:  0.04072,    Adjusted R-squared:  0.02296 
## F-statistic: 2.292 on 3 and 162 DF,  p-value: 0.0801
knitr::kable(de_su_st_lm_res$coefficients, digits=3, caption="Regression coefficients point ~ deep + surf + stra")
Regression coefficients point ~ deep + surf + stra

              Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)     26.943       5.115    5.268     0.000
deep            -0.747       0.866   -0.863     0.389
surf            -1.633       0.915   -1.785     0.076
stra             0.985       0.596    1.652     0.100

Here the diagnostic plots of the regression model points ~ deep + surf + stra:

plot(de_su_st_lm, which = c(1,2,5))

Regression model point ~ deep + surf + stra interpretation:

The outcome of this regression model shows that, with minor significance (p = 0.0762), the surface learning approach has some influence on the exam outcomes of students. The estimate for surf is negative (-1.63), which means that students tend to get about 1.6 points less on the exam for each increase of 1 in the surface learning approach value. The deep learning approach seems to have no influence at all on the number of points students reach in the exams, and the strategic approach has a non-significant positive influence.
The multiple R-squared of 0.04 is very low and shows that there is more or less no correlation between these variables and the exam points of the students.

Here I put the three variables into qplot() to show the relationships between the learning approaches and the exam points.

library(ggplot2)
# note: par(mfrow = c(2,2)) has no effect on ggplot2 graphics, so the plots are shown one by one
qplot(deep, points, data = lrn14_analysis) + geom_smooth(method = "lm") + ggtitle("Points vs. deep learning approach")

qplot(surf, points, data = lrn14_analysis) + geom_smooth(method = "lm")+ ggtitle("Points vs. surface learning approach")

qplot(stra, points, data = lrn14_analysis) + geom_smooth(method = "lm")+ ggtitle("Points vs. strategic learning approach")

So, this was an attempt to analyse the dataset with simple and multiple regression models. I hope the interpretations are easy to understand and the diary structure is not confusing.
I had fun preparing this first statistics and graphs chapter and I’m looking forward to learning more…


Chapter 3: Logistic regression

Data wrangling and performing a logistic regression analysis

Work of week 46 (11.11. - 17.11.2019)


1. Data wrangling

The R script is available in my GitHub repository. To get to the script, click here


2. Analysis

2.1. Read the prepared data set

The working directory is set using setwd(), and the data file “alc”, prepared in the data wrangling part, is read using the read.table() function. Afterwards the data frame is checked.

alc <- read.table(file = 
       "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/alc_table.txt", stringsAsFactors = TRUE) 
       # read the data file and leave the binary values as they are

# Check the data frame
dim(alc)
## [1] 382  35
str(alc)
## 'data.frame':    382 obs. of  35 variables:
##  $ school    : Factor w/ 2 levels "GP","MS": 1 1 1 1 1 1 1 1 1 1 ...
##  $ sex       : Factor w/ 2 levels "F","M": 1 1 1 1 1 2 2 1 2 2 ...
##  $ age       : int  18 17 15 15 16 16 16 17 15 15 ...
##  $ address   : Factor w/ 2 levels "R","U": 2 2 2 2 2 2 2 2 2 2 ...
##  $ famsize   : Factor w/ 2 levels "GT3","LE3": 1 1 2 1 1 2 2 1 2 1 ...
##  $ Pstatus   : Factor w/ 2 levels "A","T": 1 2 2 2 2 2 2 1 1 2 ...
##  $ Medu      : int  4 1 1 4 3 4 2 4 3 3 ...
##  $ Fedu      : int  4 1 1 2 3 3 2 4 2 4 ...
##  $ Mjob      : Factor w/ 5 levels "at_home","health",..: 1 1 1 2 3 4 3 3 4 3 ...
##  $ Fjob      : Factor w/ 5 levels "at_home","health",..: 5 3 3 4 3 3 3 5 3 3 ...
##  $ reason    : Factor w/ 4 levels "course","home",..: 1 1 3 2 2 4 2 2 2 2 ...
##  $ nursery   : Factor w/ 2 levels "no","yes": 2 1 2 2 2 2 2 2 2 2 ...
##  $ internet  : Factor w/ 2 levels "no","yes": 1 2 2 2 1 2 2 1 2 2 ...
##  $ guardian  : Factor w/ 3 levels "father","mother",..: 2 1 2 2 1 2 2 2 2 2 ...
##  $ traveltime: int  2 1 1 1 1 1 1 2 1 1 ...
##  $ studytime : int  2 2 2 3 2 2 2 2 2 2 ...
##  $ failures  : int  0 0 2 0 0 0 0 0 0 0 ...
##  $ schoolsup : Factor w/ 2 levels "no","yes": 2 1 2 1 1 1 1 2 1 1 ...
##  $ famsup    : Factor w/ 2 levels "no","yes": 1 2 1 2 2 2 1 2 2 2 ...
##  $ paid      : Factor w/ 2 levels "no","yes": 1 1 2 2 2 2 1 1 2 2 ...
##  $ activities: Factor w/ 2 levels "no","yes": 1 1 1 2 1 2 1 1 1 2 ...
##  $ higher    : Factor w/ 2 levels "no","yes": 2 2 2 2 2 2 2 2 2 2 ...
##  $ romantic  : Factor w/ 2 levels "no","yes": 1 1 1 2 1 1 1 1 1 1 ...
##  $ famrel    : int  4 5 4 3 4 5 4 4 4 5 ...
##  $ freetime  : int  3 3 3 2 3 4 4 1 2 5 ...
##  $ goout     : int  4 3 2 2 2 2 4 4 2 1 ...
##  $ Dalc      : int  1 1 2 1 1 1 1 1 1 1 ...
##  $ Walc      : int  1 1 3 1 2 2 1 1 1 1 ...
##  $ health    : int  3 3 3 5 5 5 3 1 1 5 ...
##  $ absences  : int  5 3 8 1 2 8 0 4 0 0 ...
##  $ G1        : int  2 7 10 14 8 14 12 8 16 13 ...
##  $ G2        : int  8 8 10 14 12 14 12 9 17 14 ...
##  $ G3        : int  8 8 11 14 12 14 12 10 18 14 ...
##  $ alc_use   : num  1 1 2.5 1 1.5 1.5 1 1 1 1 ...
##  $ high_use  : logi  FALSE FALSE TRUE FALSE FALSE FALSE ...
colnames(alc)
##  [1] "school"     "sex"        "age"        "address"    "famsize"   
##  [6] "Pstatus"    "Medu"       "Fedu"       "Mjob"       "Fjob"      
## [11] "reason"     "nursery"    "internet"   "guardian"   "traveltime"
## [16] "studytime"  "failures"   "schoolsup"  "famsup"     "paid"      
## [21] "activities" "higher"     "romantic"   "famrel"     "freetime"  
## [26] "goout"      "Dalc"       "Walc"       "health"     "absences"  
## [31] "G1"         "G2"         "G3"         "alc_use"    "high_use"

The data frame consists of 35 variables with 382 observations.

2.2. The data set

The provided data is from a study on student achievement in secondary education at two Portuguese schools.
It was collected via school reports and questionnaires, and it includes:

  • student grades
  • demographic features
  • social features
  • school related features

The data consists of two datasets on the students’ performance in Mathematics (mat) and Portuguese language (por). This data was already used in a publication by Cortez & Silva, 2008.

In the data wrangling part of this exercise the two datasets mat and por were merged into one dataset named “alc”. Columns (variables) not used for joining the datasets were combined by averaging. Since we are interested in analysing the alcohol consumption of the students, we created two new variables in the dataset:

  • “alc_use” –> the calculated average of weekday (“Dalc”) and weekend (“Walc”) alcohol consumption
  • “high_use” –> a logical value, TRUE or FALSE, depending on whether the “alc_use” of a student is higher than 2
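
The two derived columns can be sketched in a few lines of dplyr. This is an assumption about how the wrangling script (linked above) builds them, not a copy of it:

```r
library(dplyr) # for mutate() and %>%

# average of weekday and weekend alcohol use, then a logical high-use flag
alc <- alc %>%
  mutate(alc_use  = (Dalc + Walc) / 2,  # mean of the two consumption scores
         high_use = alc_use > 2)        # TRUE when average use is above 2
```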

All other variables are explained in detail here (check attribute information).


2.3. Performing the analyses

2.3.1. Choose variables for the analyses

The task of this exercise is to figure out the relationships between high and low alcohol consumption and other variables in this dataset. I have chosen to run the analyses on the following four variables:

  • sex –> student’s sex (binary: ‘F’ - female or ‘M’ - male)
  • failures –> number of past class failures (numeric: n if 1<=n<3, else 4)
  • famrel –> quality of family relationships (numeric: from 1 - very bad to 5 - excellent)
  • absences –> number of school absences (numeric: from 0 to 93)

I have chosen these four variables for the data analyses for the following reasons:

  1. The student’s gender (“sex”) might have quite a great influence on alcohol consumption. Young men tend to drink more alcohol for several reasons, while women are able to control themselves (please don’t take my stereotypical expressions seriously!).

  2. The number of class failures (“failures”) of a student. Someone who fails in school more often might have a higher alcohol consumption than others (personal opinion), either with high use as a cause of failure or as a response to it.

  3. I think the overall quality of family relationships (“famrel”) might have a huge effect on a student’s alcohol consumption. I know this is a very, very conservative point of view, but in a country like Portugal (Catholic, maybe with more patriarchal family structures) that might influence young people’s alcohol consumption.

  4. The number of school absences (“absences”). I think that if someone is absent from school, they may spend their time drinking alcohol. I know there are other things to do as well, but when I was that age drinking beer was somehow a way to spend your time.


2.3.2. Explore the distributions of the chosen variables numerically and graphically

The task is to explore the distributions of the variables I have chosen, and their relationship with alcohol consumption, numerically and graphically.

library(tidyr)
library(dplyr)
library(ggplot2)
library(knitr)
library(kableExtra)

First I check how many female and male students there are:

# check the number of female and male students
alc %>% group_by(sex) %>% summarise(count = n()) -> fm
knitr::kable(fm, caption="Students") %>% kable_styling(bootstrap_options = "hover", full_width = FALSE, position = "center")
Students

sex  count
F      198
M      184

2.3.2.1. General data overview

This bar plot gives us an overview over the different variables and might give a hint which variables have a strong relationship with alcohol consumption.
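
The plot itself is not reproduced here, but it could be sketched roughly as below, assuming tidyr and ggplot2 and taking the numeric variables of interest as an example (the variable selection is my assumption, not the exact plot from the diary):

```r
library(tidyr)   # gather() to reshape the data to long format
library(dplyr)   # select() and %>%
library(ggplot2) # plotting

# Draw a bar plot of each selected numeric variable in its own facet
alc %>%
  select(failures, famrel, absences, Dalc, Walc) %>%
  gather() %>%                               # long format: key = variable, value = observation
  ggplot(aes(value)) +
  facet_wrap("key", scales = "free") +       # one panel per variable
  geom_bar()
```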


2.3.2.2. General alcohol consumption overview

The graph shows the distribution of the students’ average alcohol consumption: 1 is the lowest consumption, 5 the highest. The number of students with the lowest consumption is around 130 of 382.


2.3.2.3. High alcohol consumption of students by gender

This graph gives an overview of the alcohol consumption of students divided by gender. With this graph you can more easily see the number of male or female students with higher alcohol consumption. High alcohol consumption (> 2) is indicated in turquoise, lower alcohol consumption in red. You can see that the number of male students with high alcohol consumption is higher compared to female students.
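
A plot of this kind could be produced along the following lines, assuming ggplot2 (a sketch of the described graph, not its exact code):

```r
library(ggplot2)

# Count of students by high_use, with one panel per gender
ggplot(alc, aes(x = high_use, fill = high_use)) +
  geom_bar() +                # bars of TRUE/FALSE counts
  facet_wrap(~ sex) +         # separate panels for female and male students
  ggtitle("High alcohol consumption by gender")
```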


2.3.2.4. High alcohol consumption and student’s class failures

What this graph shows is that students with higher alcohol consumption tend to have a higher class-failure rate. Female students with higher alcohol use have a higher count at 1 failure, male students at 2 or 3 failures.

Generally, you can see that students without high alcohol consumption fail fewer classes.


2.3.2.5. High alcohol consumption and student’s family relationship status

There are more students with a low alcohol consumption where the family relationship is good (4-5). Interestingly, the number of students with high alcohol consumption also increases with the quality of the family relationship status (higher counts at statuses 3-5).


2.3.2.6. High alcohol consumption and students’ school absences

High alcohol use seems to influence the rate of school absences: students with a higher alcohol consumption have an increased number of absences. Females tend to be absent more than males, even if they don’t fall into the group of high alcohol consumers.
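
One way to draw a comparison of this kind is a box plot of absences by high_use, coloured by sex. Again a sketch, assuming ggplot2 and the alc data frame:

```r
library(ggplot2)

# Box plot of school absences by alcohol use group, split by sex
ggplot(alc, aes(x = high_use, y = absences, col = sex)) +
  geom_boxplot() +  # compare medians and spread between the groups
  ggtitle("School absences vs. high alcohol consumption")
```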


2.3.3. Logistic regression - explore the relationship between the chosen variables and high alcohol consumption

2.3.3.1. Calculation of the logistic regression model

In the model I’m using the four variables I have chosen: sex, failures, famrel and absences.

# cv_glm, logistic model with the chosen variables
cv_glm <- glm(high_use ~ sex + failures + famrel + absences, data = alc, family = "binomial")

# Summary of the model (cv_glm_sum)
cv_glm_sum <- summary(cv_glm)
## 
## Call:
## glm(formula = high_use ~ sex + failures + famrel + absences, 
##     family = "binomial", data = alc)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.1174  -0.8376  -0.5867   1.0091   2.1557  
## 
## Coefficients:
##             Estimate Std. Error z value Pr(>|z|)    
## (Intercept) -0.83877    0.53446  -1.569   0.1166    
## sexM         0.99120    0.24540   4.039 5.37e-05 ***
## failures     0.42458    0.18854   2.252   0.0243 *  
## famrel      -0.27632    0.12837  -2.153   0.0314 *  
## absences     0.09052    0.02252   4.020 5.83e-05 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 465.68  on 381  degrees of freedom
## Residual deviance: 419.78  on 377  degrees of freedom
## AIC: 429.78
## 
## Number of Fisher Scoring iterations: 4
Model coefficients of chosen variables

              Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)     -0.839       0.534   -1.569     0.117
sexM             0.991       0.245    4.039     0.000
failures         0.425       0.189    2.252     0.024
famrel          -0.276       0.128   -2.153     0.031
absences         0.091       0.023    4.020     0.000

The model output shows clear and significant results. So which variables influence high alcohol consumption?

  • Being a male student significantly (p = 5.37e-05 ***) increases the probability of high alcohol consumption (coefficient ~0.99 on the log-odds scale).
  • Class failures have a smaller but still positive effect on high consumption (p = 0.0243 *).
  • The quality of family relationships has a negative effect on high alcohol consumption (p = 0.0314 *).
  • School absences also show a significant positive effect on high alcohol consumption (p = 5.83e-05 ***).

This is a first result, but we are not going to use these coefficients as our final interpretation of the logistic regression. Next we check the odds ratios of the model.


2.3.3.2. Calculating the odds ratios

Odds ratios and confidence intervals

              odds ratio   2.5 %  97.5 %
(Intercept)        0.432   0.149   1.220
sexM               2.694   1.676   4.395
failures           1.529   1.057   2.228
famrel             0.759   0.589   0.976
absences           1.095   1.050   1.147
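These values could be computed from the fitted model by exponentiating the coefficients and their confidence intervals. A minimal sketch, assuming the `cv_glm` object from above (computing the profile-likelihood intervals with `confint()` may take a moment):

```r
# exponentiate the model coefficients to get odds ratios
OR <- coef(cv_glm) %>% exp

# compute the confidence intervals and exponentiate them as well
CI <- confint(cv_glm) %>% exp

# print the odds ratios together with their confidence intervals
cbind(OR, CI)
```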

Interpretation of odds ratios: if the odds ratio is lower than 1, the risk of high alcohol consumption is lower; if it is greater than 1, the risk of high consumption increases. But we also have to look at the confidence intervals - if an interval crosses the value 1, the odds ratio is not significant! The odds ratios of this logistic model lead to the following interpretation.

  • sexM has a value of 2.7, with a confidence interval between 1.6 and 4.4. This odds ratio tells us that male students have 2.7 times higher odds of high alcohol consumption.
  • failures has a value of 1.5, with a confidence interval between 1.1 and 2.2. Each class failure increases the odds of high alcohol consumption by a factor of 1.5.
  • famrel has a value of 0.8, with a confidence interval between 0.6 and 1. Better family relationships appear to decrease the odds of high alcohol consumption.
  • absences has a value of 1.095, with a confidence interval between 1.050 and 1.147. Each school absence increases the odds of high alcohol consumption by a factor of ~1.1.

2.3.3.3. Explore the predictive power of the model

I calculated the probability of high alcohol use and added it to the “alc” data frame. A prediction of high alcohol consumption is then defined as a probability greater than 0.5; this outcome was also added to the “alc” data frame.
Here are the last 20 data points with the probability and prediction outputs:

##     age Pstatus failures famrel high_use probability prediction
## 363  17       T        0      3    FALSE  0.28017549      FALSE
## 364  17       T        0      3    FALSE  0.29878836      FALSE
## 365  18       T        0      5    FALSE  0.09793643      FALSE
## 366  18       T        0      4    FALSE  0.15809170      FALSE
## 367  18       T        0      5     TRUE  0.11513474      FALSE
## 368  18       T        0      4    FALSE  0.12520393      FALSE
## 369  17       T        0      4     TRUE  0.33697399      FALSE
## 370  18       T        0      3     TRUE  0.42202546      FALSE
## 371  18       T        0      4    FALSE  0.31608605      FALSE
## 372  17       T        0      4    FALSE  0.33597108      FALSE
## 373  19       T        1      4    FALSE  0.37092103      FALSE
## 374  18       T        1      5     TRUE  0.45736187      FALSE
## 375  18       T        0      5    FALSE  0.10622930      FALSE
## 376  18       T        0      4    FALSE  0.19766624      FALSE
## 377  19       T        1      5    FALSE  0.16593043      FALSE
## 378  18       T        0      4    FALSE  0.14641340      FALSE
## 379  18       T        2      2    FALSE  0.41066768      FALSE
## 380  18       T        0      1    FALSE  0.30079024      FALSE
## 381  17       T        0      2     TRUE  0.49046497      FALSE
## 382  18       T        0      4     TRUE  0.31608605      FALSE
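These two columns could be added along the following lines. A sketch, assuming the `cv_glm` model, the `alc` data frame and a loaded dplyr:

```r
# predict the probability of high alcohol use with the fitted model
probabilities <- predict(cv_glm, type = "response")

# add the predicted probabilities to the alc data frame
alc <- mutate(alc, probability = probabilities)

# define a prediction: TRUE if the probability is greater than 0.5
alc <- mutate(alc, prediction = probability > 0.5)

# look at the last 20 data points of the chosen columns
dplyr::select(alc, age, Pstatus, failures, famrel, high_use, probability, prediction) %>%
  tail(20)
```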
2 x 2 cross tabulation

               Prediction
High use     FALSE   TRUE
  FALSE        253     15
  TRUE          82     32

Interpretation of the prediction:
The 2 x 2 cross table compares the model's predictions of high alcohol consumption to the actual values.
The model produced 253 correct FALSE results but also 82 false negatives: in 82 cases the model predicted no high alcohol consumption although the student actually is a high consumer. This number of false negatives should be much smaller.
In 15 cases the model predicted high alcohol consumption but was wrong - these are false positives. In total, 253 cases were correctly predicted as FALSE and 32 as TRUE for high alcohol consumption.


2.3.3.4. Graphic visualization of actual values and predictions

The graph represents the 2 x 2 cross table in graphical form, with the 82 false negatives (predicted FALSE but with actual high alcohol use TRUE) in the top left of the graph.

Target variable versus the predictions

               Prediction
High use      FALSE    TRUE     Sum
  FALSE      0.6623  0.0393  0.7016
  TRUE       0.2147  0.0838  0.2984
  Sum        0.8770  0.1230  1.0000

Is the prediction of high alcohol use correct or not?
This probability table gives the correct predictions, false negatives and false positives as fractions of the total number of observations, i.e. as values between 0 and 1.
So 21.5 % of the predictions are false negatives! 3.9 % are false positives.
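This probability table could be produced from the cross tabulation. A sketch, assuming the `high_use` and `prediction` columns are in `alc` and a pipe operator is available:

```r
# cross tabulate the target variable versus the predictions,
# as proportions of the total, with row and column sums added
table(high_use = alc$high_use, prediction = alc$prediction) %>%
  prop.table %>%
  addmargins
```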


2.3.3.5. Computing the loss function - the training error

To measure the performance of the logistic regression model we use a loss function that computes the average number of wrong predictions of our model.
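The loss function follows the one used in the DataCamp exercises; a minimal sketch:

```r
# loss function: the mean of incorrect predictions
# class = actual values (TRUE/FALSE), prob = predicted probabilities
loss_func <- function(class, prob) {
  n_wrong <- abs(class - prob) > 0.5
  mean(n_wrong)
}

# average number of wrong predictions on the training data
loss_func(class = alc$high_use, prob = alc$probability)
```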

This results in the following value:

## [1] 0.2539267

So 25.4 % of all predictions are wrong - roughly every fourth prediction is incorrect (to check this, sum the false positive and false negative fractions from the “Target variable versus the predictions” table: 21.5 + 3.9 = 25.4).
So on the training data the model has a prediction error of 25.4 %.


2.3.3.6. Cross validation - the testing error

Here we test how good our model is on unseen data: the loss function value is computed on data that was not used to train the model. The lower the value, the better. In cross validation the logistic regression model is tested on defined subsets of observations. Here we choose K = 10, so the data set is split into 10 groups and each group is used in turn as the test set.

# K-fold cross-validation
library(boot)
cross_val<- cv.glm(data = alc, cost = loss_func, glmfit = cv_glm, K = 10)

The cross validation results in the following value:

# average number of wrong predictions in the cross validation
cross_val$delta[1]
## [1] 0.2539267

This is the average number of wrong predictions in the cross validation, i.e. the prediction error on the testing data, which is typically somewhat higher than the error on the training data.

Be aware that this value changes with each new computation, because the cross validation draws a different random split of observations each time.

The cross validation tells us whether the model is overfitted to the data: if the model generalizes well, the cross-validation error should be only slightly higher than the loss-function value computed on the training data. A testing error close to the training error therefore indicates that the model is not too deep in the data - it is more of a general one.


Bonus tasks

Bonus task 1

The 10-fold cross-validation of my model was computed in section 2.3.3.6. above. My logistic model does not perform better than the DataCamp model (both have an error value of ~0.26).

I think a better model could be found - maybe more variables need to be used in the model, for example the age of the students or if students are in a relationship. These factors could make the model more accurate.
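Such an extended model could be fitted and cross-validated in the same way as before. A hypothetical sketch - `age` and `romantic` (relationship status) are assumed to be columns of the questionnaire data, and `loss_func` and the boot package are assumed to be available:

```r
# a hypothetical extended model with age and relationship status added
cv_glm2 <- glm(high_use ~ sex + failures + famrel + absences + age + romantic,
               data = alc, family = "binomial")

# 10-fold cross-validation of the extended model
library(boot)
cv.glm(data = alc, cost = loss_func, glmfit = cv_glm2, K = 10)$delta[1]
```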


Bonus task 2

Cross-validations with different logistic regression models

Model 1: 8 variables (sex, age, Pstatus, guardian, failures, famrel, freetime, absences)

Model 2: 7 variables (sex, age, guardian, failures, famrel, freetime, absences) - Pstatus

Model 3: 6 variables (sex, age, failures, famrel, freetime, absences) - Pstatus, guardian

Model 4: 5 variables (sex, age, failures, freetime, absences) - Pstatus, guardian, famrel

Model 5: 4 variables (sex, age, failures, absences) - Pstatus, guardian, famrel, freetime

Model 6: 3 variables (sex, failures, absences) - Pstatus, guardian, famrel, freetime, age

The graph shows a red line for the training error values and a green line for the testing error values. You can see how the error values change: the more variables the logistic regression model uses, the higher the training and testing errors become. It seems the model is then too “deep” in the data.
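The training and testing errors of the six models could be computed along these lines. A sketch, assuming the `loss_func` from section 2.3.3.5., the boot package, and the variable sets listed above; the exact numbers vary between runs because of the random cross-validation splits:

```r
library(boot)

# the predictor sets of the six models, from 8 variables down to 3
models <- list(
  high_use ~ sex + age + Pstatus + guardian + failures + famrel + freetime + absences,
  high_use ~ sex + age + guardian + failures + famrel + freetime + absences,
  high_use ~ sex + age + failures + famrel + freetime + absences,
  high_use ~ sex + age + failures + freetime + absences,
  high_use ~ sex + age + failures + absences,
  high_use ~ sex + failures + absences
)

# training error (loss on the fitted data) and testing error (10-fold CV) per model
errors <- sapply(models, function(f) {
  fit <- glm(f, data = alc, family = "binomial")
  train <- loss_func(class = alc$high_use, prob = predict(fit, type = "response"))
  test  <- cv.glm(data = alc, cost = loss_func, glmfit = fit, K = 10)$delta[1]
  c(training = train, testing = test)
})
errors
```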



Chapter 4: Clustering and classification

Data wrangling and performing clustering and classification

Work of week 47 (18.11. - 24.11.2019)


1. Analysis of Boston data set

1.1. Load the data set & check the structure

# load necessary packages
library(MASS) # package includes the Boston data set
library(tidyr)
library(dplyr)
library(corrplot)
library(ggplot2)
library(GGally)
library(knitr)
library(kableExtra)

data(Boston) # load the Boston data set

1.2. The data set

The Boston data set consists of 14 variables with 506 observations. The data is about housing values in Boston suburbs. The data set has the following variables (columns):

  • crim: the per capita crime rate by town
  • zn: proportion of residential land zoned for lots over 25,000 sq.ft
  • indus: proportion of non-retail business acres per town
  • chas: Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)
  • nox: nitrogen oxides concentration (parts per 10 million)
  • rm: average number of rooms per dwelling
  • age: proportion of owner-occupied units built prior to 1940
  • dis: weighted mean of distances to five Boston employment centres
  • rad: index of accessibility to radial highways
  • tax: full-value property-tax rate per $10,000
  • ptratio: pupil-teacher ratio by town
  • black: 1000(Bk - 0.63)^2 where Bk is the proportion of blacks by town
  • lstat: lower status of the population (percent)
  • medv: median value of owner-occupied homes in $1000s

In the later analysis we are interested in the per capita crime rate. Overall the crime rate is low, and its distribution is concentrated at low values.

str(Boston) # check the structure
## 'data.frame':    506 obs. of  14 variables:
##  $ crim   : num  0.00632 0.02731 0.02729 0.03237 0.06905 ...
##  $ zn     : num  18 0 0 0 0 0 12.5 12.5 12.5 12.5 ...
##  $ indus  : num  2.31 7.07 7.07 2.18 2.18 2.18 7.87 7.87 7.87 7.87 ...
##  $ chas   : int  0 0 0 0 0 0 0 0 0 0 ...
##  $ nox    : num  0.538 0.469 0.469 0.458 0.458 0.458 0.524 0.524 0.524 0.524 ...
##  $ rm     : num  6.58 6.42 7.18 7 7.15 ...
##  $ age    : num  65.2 78.9 61.1 45.8 54.2 58.7 66.6 96.1 100 85.9 ...
##  $ dis    : num  4.09 4.97 4.97 6.06 6.06 ...
##  $ rad    : int  1 2 2 3 3 3 5 5 5 5 ...
##  $ tax    : num  296 242 242 222 222 222 311 311 311 311 ...
##  $ ptratio: num  15.3 17.8 17.8 18.7 18.7 18.7 15.2 15.2 15.2 15.2 ...
##  $ black  : num  397 397 393 395 397 ...
##  $ lstat  : num  4.98 9.14 4.03 2.94 5.33 ...
##  $ medv   : num  24 21.6 34.7 33.4 36.2 28.7 22.9 27.1 16.5 18.9 ...
knitr::kable(head(Boston)) %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") # the data frame head
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
0.00632 18 2.31 0 0.538 6.575 65.2 4.0900 1 296 15.3 396.90 4.98 24.0
0.02731 0 7.07 0 0.469 6.421 78.9 4.9671 2 242 17.8 396.90 9.14 21.6
0.02729 0 7.07 0 0.469 7.185 61.1 4.9671 2 242 17.8 392.83 4.03 34.7
0.03237 0 2.18 0 0.458 6.998 45.8 6.0622 3 222 18.7 394.63 2.94 33.4
0.06905 0 2.18 0 0.458 7.147 54.2 6.0622 3 222 18.7 396.90 5.33 36.2
0.02985 0 2.18 0 0.458 6.430 58.7 6.0622 3 222 18.7 394.12 5.21 28.7
knitr::kable(summary(Boston)) %>% 
  kable_styling(bootstrap_options = "striped", position = "center", font_size = 11) %>% 
  scroll_box(width = "100%", height = "300px")# summary statistics
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
Min. : 0.00632 Min. : 0.00 Min. : 0.46 Min. :0.00000 Min. :0.3850 Min. :3.561 Min. : 2.90 Min. : 1.130 Min. : 1.000 Min. :187.0 Min. :12.60 Min. : 0.32 Min. : 1.73 Min. : 5.00
1st Qu.: 0.08204 1st Qu.: 0.00 1st Qu.: 5.19 1st Qu.:0.00000 1st Qu.:0.4490 1st Qu.:5.886 1st Qu.: 45.02 1st Qu.: 2.100 1st Qu.: 4.000 1st Qu.:279.0 1st Qu.:17.40 1st Qu.:375.38 1st Qu.: 6.95 1st Qu.:17.02
Median : 0.25651 Median : 0.00 Median : 9.69 Median :0.00000 Median :0.5380 Median :6.208 Median : 77.50 Median : 3.207 Median : 5.000 Median :330.0 Median :19.05 Median :391.44 Median :11.36 Median :21.20
Mean : 3.61352 Mean : 11.36 Mean :11.14 Mean :0.06917 Mean :0.5547 Mean :6.285 Mean : 68.57 Mean : 3.795 Mean : 9.549 Mean :408.2 Mean :18.46 Mean :356.67 Mean :12.65 Mean :22.53
3rd Qu.: 3.67708 3rd Qu.: 12.50 3rd Qu.:18.10 3rd Qu.:0.00000 3rd Qu.:0.6240 3rd Qu.:6.623 3rd Qu.: 94.08 3rd Qu.: 5.188 3rd Qu.:24.000 3rd Qu.:666.0 3rd Qu.:20.20 3rd Qu.:396.23 3rd Qu.:16.95 3rd Qu.:25.00
Max. :88.97620 Max. :100.00 Max. :27.74 Max. :1.00000 Max. :0.8710 Max. :8.780 Max. :100.00 Max. :12.127 Max. :24.000 Max. :711.0 Max. :22.00 Max. :396.90 Max. :37.97 Max. :50.00

1.2.1 Overview plot and data description

# graphical overview of the Boston data set
ov_boston <- ggpairs(Boston, mapping = aes(), title ="Overview of the Boston data set", 
                     lower = list(combo = wrap("facethist", bins = 20)), 
                     upper = list(continuous = wrap("cor", size = 2.8)))
# overview plot of interesting data
bos_detail <- Boston[,c("crim","dis","tax","medv")]

ov_bos_detail <- ggpairs(bos_detail, mapping = aes(),lower = list(combo = wrap("facethist", bins = 20)), 
                     upper = list(continuous = wrap("cor", size = 5)))
ov_bos_detail

The overview plot of the variables of interest (to me) and their histograms give the following information: the crime rate distribution is concentrated at the low end. There might be some relationship with the distance to the employment centres. The tax rate has a high count at the lower end, decreases with higher tax and rises again at high taxes. The median value of the owner-occupied homes peaks around $20,000. Tax and crime rate show a correlation of 50 %.


1.2.2. Data correlations of the Boston data set

# calculate the correlation matrix and round it
cor_matrix<-cor(Boston) %>% round(digits = 2)

# print the correlation matrix
knitr::kable(cor_matrix, caption="Correlation matrix values") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px")# summary statistics
Correlation matrix values
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
crim 1.00 -0.20 0.41 -0.06 0.42 -0.22 0.35 -0.38 0.63 0.58 0.29 -0.39 0.46 -0.39
zn -0.20 1.00 -0.53 -0.04 -0.52 0.31 -0.57 0.66 -0.31 -0.31 -0.39 0.18 -0.41 0.36
indus 0.41 -0.53 1.00 0.06 0.76 -0.39 0.64 -0.71 0.60 0.72 0.38 -0.36 0.60 -0.48
chas -0.06 -0.04 0.06 1.00 0.09 0.09 0.09 -0.10 -0.01 -0.04 -0.12 0.05 -0.05 0.18
nox 0.42 -0.52 0.76 0.09 1.00 -0.30 0.73 -0.77 0.61 0.67 0.19 -0.38 0.59 -0.43
rm -0.22 0.31 -0.39 0.09 -0.30 1.00 -0.24 0.21 -0.21 -0.29 -0.36 0.13 -0.61 0.70
age 0.35 -0.57 0.64 0.09 0.73 -0.24 1.00 -0.75 0.46 0.51 0.26 -0.27 0.60 -0.38
dis -0.38 0.66 -0.71 -0.10 -0.77 0.21 -0.75 1.00 -0.49 -0.53 -0.23 0.29 -0.50 0.25
rad 0.63 -0.31 0.60 -0.01 0.61 -0.21 0.46 -0.49 1.00 0.91 0.46 -0.44 0.49 -0.38
tax 0.58 -0.31 0.72 -0.04 0.67 -0.29 0.51 -0.53 0.91 1.00 0.46 -0.44 0.54 -0.47
ptratio 0.29 -0.39 0.38 -0.12 0.19 -0.36 0.26 -0.23 0.46 0.46 1.00 -0.18 0.37 -0.51
black -0.39 0.18 -0.36 0.05 -0.38 0.13 -0.27 0.29 -0.44 -0.44 -0.18 1.00 -0.37 0.33
lstat 0.46 -0.41 0.60 -0.05 0.59 -0.61 0.60 -0.50 0.49 0.54 0.37 -0.37 1.00 -0.74
medv -0.39 0.36 -0.48 0.18 -0.43 0.70 -0.38 0.25 -0.38 -0.47 -0.51 0.33 -0.74 1.00
# mark insignificant values according to the significance level
p.mat <- cor.mtest(cor_matrix)$p

# visualize the correlation matrix
# correlations / colour shows the correlation values
corrplot(cor_matrix, method="circle", type="upper",  tl.cex = 0.6, p.mat = p.mat, sig.level = 0.01, title="Correlations of the Boston data set", mar=c(0,0,1,0))  

Insignificant correlations are shown with a cross in the square. The crime rate shows rather strong positive correlations with accessibility to radial highways (rad), the property-tax rate (tax) and the lower status of the population (lstat).


1.3. Data set analysis

1.3.1. Data standardization (scaling of the data set)

Here we standardize the data set and print out summaries of the scaled data. Scaling is a transformation that standardizes the values so that observations of different variables can be compared: from each value the column mean is subtracted and the result is divided by the standard deviation. This produces negative and positive values with a mean of 0 and a standard deviation of 1 for each variable.
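What `scale()` does can be sketched by hand for a single column (assuming the Boston data set is loaded):

```r
# standardize the crime rate manually: subtract the mean, divide by the sd
crim_scaled <- (Boston$crim - mean(Boston$crim)) / sd(Boston$crim)

# this matches the crim column of scale(Boston)
summary(crim_scaled)
```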

# center and standardize variables
boston_scaled <- scale(Boston)

# summaries of the scaled variables
knitr::kable(summary(boston_scaled), caption="Summary of scaled Boston data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center", font_size = 11)  %>% 
  scroll_box(width = "100%", height = "300px")
Summary of scaled Boston data set
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
Min. :-0.419367 Min. :-0.48724 Min. :-1.5563 Min. :-0.2723 Min. :-1.4644 Min. :-3.8764 Min. :-2.3331 Min. :-1.2658 Min. :-0.9819 Min. :-1.3127 Min. :-2.7047 Min. :-3.9033 Min. :-1.5296 Min. :-1.9063
1st Qu.:-0.410563 1st Qu.:-0.48724 1st Qu.:-0.8668 1st Qu.:-0.2723 1st Qu.:-0.9121 1st Qu.:-0.5681 1st Qu.:-0.8366 1st Qu.:-0.8049 1st Qu.:-0.6373 1st Qu.:-0.7668 1st Qu.:-0.4876 1st Qu.: 0.2049 1st Qu.:-0.7986 1st Qu.:-0.5989
Median :-0.390280 Median :-0.48724 Median :-0.2109 Median :-0.2723 Median :-0.1441 Median :-0.1084 Median : 0.3171 Median :-0.2790 Median :-0.5225 Median :-0.4642 Median : 0.2746 Median : 0.3808 Median :-0.1811 Median :-0.1449
Mean : 0.000000 Mean : 0.00000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000
3rd Qu.: 0.007389 3rd Qu.: 0.04872 3rd Qu.: 1.0150 3rd Qu.:-0.2723 3rd Qu.: 0.5981 3rd Qu.: 0.4823 3rd Qu.: 0.9059 3rd Qu.: 0.6617 3rd Qu.: 1.6596 3rd Qu.: 1.5294 3rd Qu.: 0.8058 3rd Qu.: 0.4332 3rd Qu.: 0.6024 3rd Qu.: 0.2683
Max. : 9.924110 Max. : 3.80047 Max. : 2.4202 Max. : 3.6648 Max. : 2.7296 Max. : 3.5515 Max. : 1.1164 Max. : 3.9566 Max. : 1.6596 Max. : 1.7964 Max. : 1.6372 Max. : 0.4406 Max. : 3.5453 Max. : 2.9865
# change the object to data frame
boston_scaled <- as.data.frame(boston_scaled)

knitr::kable(boston_scaled, caption="Values of scaled Boston data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px")
Values of scaled Boston data set
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
-0.4193669 0.2845483 -1.2866362 -0.2723291 -0.1440749 0.4132629 -0.1198948 0.1400750 -0.9818712 -0.6659492 -1.4575580 0.4406159 -1.0744990 0.1595278
-0.4169267 -0.4872402 -0.5927944 -0.2723291 -0.7395304 0.1940824 0.3668034 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.4406159 -0.4919525 -0.1014239
-0.4169290 -0.4872402 -0.5927944 -0.2723291 -0.7395304 1.2814456 -0.2655490 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.3960351 -1.2075324 1.3229375
-0.4163384 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886
-0.4120741 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.2273620 -0.5106743 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4406159 -1.0254866 1.4860323
-0.4166314 -0.4872402 -1.3055857 -0.2723291 -0.8344581 0.2068916 -0.3508100 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4101651 -1.0422909 0.6705582
-0.4098372 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249
-0.4032966 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.1603069 0.9778406 1.0236249 -0.5224844 -0.5769480 -1.5037485 0.4406159 0.9097999 0.4965904
-0.3955433 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.9302853 1.1163897 1.0861216 -0.5224844 -0.5769480 -1.5037485 0.3281233 2.4193794 -0.6559463
-0.4003331 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3994130 0.6154813 1.3283202 -0.5224844 -0.5769480 -1.5037485 0.3289995 0.6227277 -0.3949946
-0.3939564 0.0487240 -0.4761823 -0.2723291 -0.2648919 0.1314594 0.9138948 1.2117800 -0.5224844 -0.5769480 -1.5037485 0.3926395 1.0918456 -0.8190411
-0.4064448 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3922967 0.5089051 1.1547920 -0.5224844 -0.5769480 -1.5037485 0.4406159 0.0863929 -0.3949946
-0.4091990 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.5630867 -1.0506607 0.7863653 -0.5224844 -0.5769480 -1.5037485 0.3705134 0.4280788 -0.0905509
-0.3468869 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4776917 -0.2406812 0.4333252 -0.6373311 -0.6006817 1.1753027 0.4406159 -0.6151835 -0.2318998
-0.3459336 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 0.5657458 0.3166900 -0.6373311 -0.6006817 1.1753027 0.2557205 -0.3351131 -0.4711055
-0.3471625 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6413655 -0.4289659 0.3341188 -0.6373311 -0.6006817 1.1753027 0.4265954 -0.5857761 -0.2862647
-0.2975737 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4976172 -1.3952572 0.3341188 -0.6373311 -0.6006817 1.1753027 0.3305330 -0.8504426 0.0616709
-0.3289320 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4193385 0.4662746 0.2198106 -0.6373311 -0.6006817 1.1753027 0.3294377 0.2824421 -0.5472164
-0.3267801 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.1793541 -1.1359217 0.0006921 -0.6373311 -0.6006817 1.1753027 -0.7413783 -0.1348628 -0.2536457
-0.3357215 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.7936533 0.0328645 0.0006921 -0.6373311 -0.6006817 1.1753027 0.3754425 -0.1922772 -0.4711055
-0.2745709 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.0171035 1.0488914 0.0013569 -0.6373311 -0.6006817 1.1753027 0.2179309 1.1716657 -0.9712629
-0.3210451 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4549197 0.7327152 0.1031753 -0.6373311 -0.6006817 1.1753027 0.3927490 0.1648126 -0.3188837
-0.2768170 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2030044 0.8215287 0.0863639 -0.6373311 -0.6006817 1.1753027 0.4406159 0.8495847 -0.7972951
-0.3051886 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 1.1163897 0.1425445 -0.6373311 -0.6006817 1.1753027 0.4147656 1.0120256 -0.8734060
-0.3328778 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.5132730 0.9067897 0.2871038 -0.6373311 -0.6006817 1.1753027 0.4124654 0.5106995 -0.7538032
-0.3223820 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.9758293 0.6083763 0.3132232 -0.6373311 -0.6006817 1.1753027 -0.5833190 0.5401069 -0.9386440
-0.3419866 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 0.7717932 0.4212153 -0.6373311 -0.6006817 1.1753027 0.2213265 0.3020471 -0.6450733
-0.3089856 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.3382132 0.7185050 0.3126533 -0.6373311 -0.6006817 1.1753027 -0.5508966 0.6479340 -0.8407871
-0.3302353 -0.4872402 -0.4368257 -0.2723291 -0.1440749 0.2994029 0.9174474 0.3132707 -0.6373311 -0.6006817 1.1753027 0.3424724 0.0205763 -0.4493595
-0.3035587 -0.4872402 -0.4368257 -0.2723291 -0.1440749 0.5541647 0.6652169 0.2108350 -0.6373311 -0.6006817 1.1753027 0.2580207 -0.0942525 -0.1666618
-0.2886358 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8135788 0.9067897 0.2079856 -0.6373311 -0.6006817 1.1753027 0.0382932 1.3929213 -1.0691198
-0.2626044 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.3026319 1.1163897 0.1804414 -0.6373311 -0.6006817 1.1753027 0.2196834 0.0541848 -0.8734060
-0.2587365 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4762685 0.4769322 0.0925851 -0.6373311 -0.6006817 1.1753027 -1.3590472 2.1085012 -1.0147549
-0.2862048 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8306578 0.9387626 -0.0037245 -0.6373311 -0.6006817 1.1753027 0.0229582 0.7977717 -1.0256279
-0.2325982 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 1.0062609 -0.0167367 -0.6373311 -0.6006817 1.1753027 -1.1869674 1.0764418 -0.9821359
-0.4126414 -0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.5004637 -0.0133185 -0.2064589 -0.5224844 -0.7668172 0.3438730 0.4406159 -0.4163335 -0.3949946
-0.4087735 -0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.6314027 -0.2548913 -0.1981007 -0.5224844 -0.7668172 0.3438730 0.2287748 -0.1740726 -0.2753917
-0.4107848 -0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.6185935 -0.9618471 0.0660857 -0.5224844 -0.7668172 0.3438730 0.4406159 -0.5437656 -0.1666618
-0.3997507 -0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.4534965 -1.3632843 0.0248170 -0.5224844 -0.7668172 0.3438730 0.4026072 -0.3533177 0.2356387
-0.4168895 2.7285450 -1.1933466 -0.2723291 -1.0933517 0.4417279 -1.6616978 0.7627153 -0.7521778 -0.9270193 -0.0718418 0.4267049 -1.1669222 0.8988910
-0.4161966 2.7285450 -1.1933466 -0.2723291 -1.0933517 1.0523023 -1.8748503 0.7627153 -0.7521778 -0.9270193 -0.0718418 0.4265954 -1.4946046 1.3446835
-0.4052857 -0.4872402 -0.6161168 -0.2723291 -0.9207559 0.6907967 -2.3331282 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.3147600 -1.0941039 0.4422255
-0.4036511 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.1645767 -2.2016841 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.2924148 -0.9582698 0.3008766
-0.4015748 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.1048002 -2.2052367 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.4138893 -0.7300124 0.2356387
-0.4058380 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.3069017 -1.0151352 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.3583550 -0.4345381 -0.1449159
-0.4001727 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.8576995 -1.2353928 0.6199131 -0.7521778 -1.0397541 -0.2566040 0.4406159 -0.3421149 -0.3515026
-0.3982033 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.7096815 -1.2531555 0.6199131 -0.7521778 -1.0397541 -0.2566040 0.4406159 0.2096238 -0.2753917
-0.3934472 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.3624084 0.6012712 0.8996287 -0.7521778 -1.0397541 -0.2566040 0.3950493 0.8607875 -0.6450733
-0.3905872 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -1.2604793 0.9494202 0.9853955 -0.7521778 -1.0397541 -0.2566040 0.4406159 2.5426103 -0.8842790
-0.3945516 -0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.9715595 -0.2335761 1.0887811 -0.7521778 -1.0397541 -0.2566040 0.4406159 0.4966960 -0.3406296
-0.4097861 0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.4577662 -0.8126404 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4259382 0.1115992 -0.3080107
-0.4150596 0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.2414322 -0.1980507 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4085221 -0.4513423 -0.2210268
-0.4138702 0.4131797 -0.8012385 -0.2723291 -0.9984241 0.3221749 -1.6865656 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4406159 -1.0324884 0.2682577
-0.4143109 0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.4079525 -1.6759080 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4406159 -0.5913775 0.0942899
-0.4185206 2.7285450 -1.0402932 -0.2723291 -1.2486880 -0.5645100 -0.7451421 1.6738568 -0.7521778 0.3605309 1.2214933 0.4406159 0.3006467 -0.3949946
-0.4185775 3.3717021 -1.4455202 -0.2723291 -1.3090965 1.3725336 -1.6581453 2.3277455 -0.5224844 -1.0812880 -0.2566040 0.4299910 -1.0983050 1.3990484
-0.4177126 3.1573164 -1.5154874 -0.2723291 -1.2486880 0.1399989 -1.1678945 2.5609210 -0.8670245 -0.5650812 -0.5337472 0.4406159 -0.9638712 0.2356387
-0.4184369 3.8004735 -1.4309437 -0.2723291 -1.2400582 0.7562662 -0.9973725 2.1511780 -0.5224844 -0.9032856 -1.5499390 0.3968018 -1.2187352 0.9858749
-0.4021456 0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.1987347 -1.3988097 1.9089794 -0.1779443 -0.7371501 0.5748257 0.3724850 -0.8112328 0.0834169
-0.4080945 0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.5090032 -0.7593522 1.4897384 -0.1779443 -0.7371501 0.5748257 0.4406159 -0.4807497 -0.3188837
-0.4027420 0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.7737278 -0.0843694 1.6290739 -0.1779443 -0.7371501 0.5748257 0.4210091 0.0695886 -0.4167406
-0.4001390 0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.4534965 0.8819220 1.4358374 -0.1779443 -0.7371501 0.5748257 0.2344707 0.2502341 -0.7103112
-0.4072819 0.5846882 -0.8755787 -0.2723291 -0.8776070 0.2438961 -0.0275287 1.6291213 -0.1779443 -0.7371501 0.5748257 0.4406159 -0.8294374 -0.0361860
-0.4053950 0.5846882 -0.8755787 -0.2723291 -0.8776070 0.6794107 -0.8943488 1.9878602 -0.1779443 -0.7371501 0.5748257 0.4261573 -0.4415399 0.2682577
-0.4178335 0.2631097 -1.4221978 -0.2723291 -1.1960462 1.1661623 -0.3223896 2.5776850 -0.7521778 -1.1406221 0.0667298 0.4005260 -0.6445909 1.1380967
-0.4159350 2.9429307 -1.1321252 -0.2723291 -1.3522454 0.0076366 -1.8037995 1.3375333 -0.6373311 -0.4226793 -1.0880337 0.4406159 -1.1179099 0.1051628
-0.4150107 2.9429307 -1.1321252 -0.2723291 -1.3522454 -0.7082582 -1.3313114 1.3375333 -0.6373311 -0.4226793 -1.0880337 0.4406159 -0.3379138 -0.3406296
-0.4133715 0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.5787425 -1.6759080 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4330580 -0.6375891 -0.0579320
-0.4043440 0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.9829455 -1.1288166 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4406159 0.0611865 -0.5580894
-0.4052020 0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.5687797 -1.2638131 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4406159 -0.5409648 -0.1775348
-0.4098407 -0.4872402 -0.0476329 -0.2723291 -1.2227986 0.1883894 -2.2016841 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2963581 -0.8308377 0.1812738
*(Remaining rows of the printed standardized data matrix omitted — the full numeric printout is not readable and carries no column or row labels.)*
-0.3771792 -0.4872402 -0.1802792 -0.2723291 -0.0922961 0.4018769 0.6652169 -0.0915333 -0.6373311 -0.6184819 -0.0256513 0.4273621 -0.4723476 0.1377818
-0.3906233 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.8249648 0.3241729 0.0712146 -0.6373311 -0.6184819 -0.0256513 0.4353582 -0.1614694 -0.6885653
-0.3831002 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.5275055 0.5195627 0.0966692 -0.6373311 -0.6184819 -0.0256513 0.3727041 0.7949710 -0.5145975
-0.3915928 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.7153745 0.1110204 0.1123884 -0.6373311 -0.6184819 -0.0256513 0.4406159 0.4602869 -0.2971377
-0.3733636 -0.4872402 -0.1802792 -0.2723291 -0.0922961 0.1385756 -0.0488439 -0.1246813 -0.6373311 -0.6184819 -0.0256513 0.4221044 -0.3211096 0.0616709
-0.3648244 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.2442787 -0.3472574 0.0982364 -0.6373311 -0.6184819 -0.0256513 0.4332770 0.0107739 -0.1666618
-0.4006168 -0.4872402 -0.5476072 -0.2723291 -0.5324154 0.2011986 -0.5781726 0.3539696 -0.5224844 -0.7193499 0.5286352 0.4406159 -0.7636208 0.1377818
-0.3989904 -0.4872402 -0.5476072 -0.2723291 -0.5324154 0.1300361 -0.5071218 0.3539696 -0.5224844 -0.7193499 0.5286352 0.4406159 -0.8098324 0.0616709
-0.3792788 -0.4872402 -0.5476072 -0.2723291 -0.5324154 -0.3467527 -0.6634336 0.4397839 -0.5224844 -0.7193499 0.5286352 0.4406159 -0.6936032 -0.2318998
-0.3870937 -0.4872402 -0.5476072 -0.2723291 -0.5324154 -0.8206950 0.2033865 0.4397839 -0.5224844 -0.7193499 0.5286352 0.3774141 -0.1278610 -0.4384865
-0.3804472 -0.4872402 -0.5476072 -0.2723291 -0.5324154 0.1855429 -1.0115827 0.4397839 -0.5224844 -0.7193499 0.5286352 0.4406159 -0.9148588 0.2682577
-0.3977964 -0.4872402 -0.5476072 -0.2723291 -0.5324154 0.2083149 -1.9139283 0.7697438 -0.5224844 -0.7193499 0.5286352 0.4053456 -1.0604955 0.2247657
-0.3848208 -0.4872402 -0.5476072 -0.2723291 -0.5324154 0.0389481 -1.4094674 0.7697438 -0.5224844 -0.7193499 0.5286352 0.4406159 -0.9106578 0.0507979
-0.3920800 -0.4872402 -0.5476072 -0.2723291 -0.5324154 -0.2869762 -0.8836912 0.7697438 -0.5224844 -0.7193499 0.5286352 0.4406159 0.0191760 -0.0361860
-0.4124089 -0.4872402 -1.1510747 -0.2723291 -0.8171985 -0.5929750 -1.5195961 0.6741466 -0.6373311 0.1291279 -0.7185093 0.2822280 -0.3757233 -0.3515026
-0.4122845 -0.4872402 -1.1510747 -0.2723291 -0.8171985 0.0688364 -1.8251147 0.6741466 -0.6373311 0.1291279 -0.7185093 0.2030341 -0.7440159 0.0073060
-0.4148189 -0.4872402 -1.1510747 -0.2723291 -0.8171985 -0.2001579 -1.2922335 0.9871052 -0.6373311 0.1291279 -0.7185093 0.1303027 -0.4989543 -0.2971377
-0.4142620 1.0134596 -0.7400171 -0.2723291 -1.0079168 -0.8235415 -1.4272301 1.3514003 -0.9818712 -0.6184819 -0.7185093 0.4090698 -0.0312367 -0.5907084
-0.4160722 1.0134596 -0.7400171 -0.2723291 -1.0079168 -0.3609852 -1.6084097 1.3514003 -0.9818712 -0.6184819 -0.7185093 0.0610765 -0.6753986 -0.3406296
-0.4141923 -0.4872402 -0.8668328 -0.2723291 -0.3425600 0.0446411 -1.0826335 1.2648262 -0.5224844 -1.0931548 0.8057784 0.3618601 -0.9764743 -0.0361860
-0.4157560 -0.4872402 -0.8668328 -0.2723291 -0.3425600 0.0361016 -1.0684234 1.2648262 -0.5224844 -1.0931548 0.8057784 0.3584645 -0.8266367 -0.1992808
-0.4154967 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.3524457 -1.2105250 1.0401514 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.6501923 -0.1557889
-0.4161175 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.5915517 -0.7913251 0.6819824 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.3995293 -0.3297567
-0.4165663 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.5545472 -0.3188371 0.8642962 -0.5224844 -1.0931548 0.8057784 0.4177230 -0.2931025 -0.4384865
-0.4162582 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.3211342 -1.1110539 0.4830472 -0.5224844 -1.0931548 0.8057784 0.4322912 -0.5801747 -0.2101538
-0.4137110 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.4264547 -0.8232980 0.4830472 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.4079314 -0.3841216
-0.4129506 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.4506500 -0.3579151 0.4830472 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.4709472 -0.4167406
-0.4185892 1.0134596 -1.4017907 -0.2723291 -0.9725347 1.3611476 -0.6847489 1.5400303 -0.9818712 -0.7371501 -1.3651769 0.4169563 -1.0030810 1.1054777
-0.4171976 -0.4872402 -1.3478576 -0.2723291 -0.3166707 0.3634492 -0.3152846 1.1738830 -0.9818712 0.0816606 -1.1804147 0.3645985 -0.5605698 -0.6559463
-0.4171452 1.8710023 -1.0723615 -0.2723291 -0.6100835 0.5854762 -0.4325184 0.9199069 -0.5224844 -0.2268768 -0.3951756 0.4406159 -0.7664215 0.1486548
-0.4165570 1.8710023 -1.0723615 -0.2723291 -0.6100835 0.8388147 -1.4378877 1.2681505 -0.5224844 -0.2268768 -0.3951756 0.3428010 -1.1263120 0.9423829
-0.4164826 -0.4872402 -0.9834448 -0.2723291 -0.9725347 -0.3851804 -0.7131692 2.0033894 -0.7521778 -0.3336782 0.1591109 0.3172793 -0.2973036 -0.5472164
-0.4129379 -0.4872402 -0.9834448 -0.2723291 -0.9725347 -0.5502775 -0.5781726 2.0033894 -0.7521778 -0.3336782 0.1591109 0.0869268 0.0023717 -0.5798354
-0.4179277 3.1573164 -1.0184285 -0.2723291 -1.0847220 0.3292912 -1.4520979 2.2511443 -0.6373311 -0.3396116 -0.2566040 0.3916537 -0.8812504 0.0616709
-0.4183566 2.9429307 -1.3303658 -0.2723291 -1.0329432 0.4986579 -1.3810470 2.1602961 -0.6373311 -0.7608838 -0.6723188 0.3753329 -0.9330634 0.2138927
-0.4167314 1.2278453 -1.4411472 -0.2723291 -1.0847220 0.9313260 -1.2105250 2.3730984 -0.9818712 -0.4345461 0.5748257 0.3633936 -0.9470669 0.4422255
-0.4128809 1.2278453 -1.4411472 -0.2723291 -1.0847220 0.2922867 -0.8588234 2.3730984 -0.9818712 -0.4345461 0.5748257 0.4406159 -0.9344638 0.0399249
-0.4108592 2.0853880 -1.3770106 -0.2723291 -1.2400582 0.4189559 -1.1607894 3.2840500 -0.6373311 0.0163931 -0.0718418 0.1545100 -1.0030810 0.1704008
-0.4116799 2.0853880 -1.3770106 -0.2723291 -1.2400582 -0.5702030 -1.7789317 3.2840500 -0.6373311 0.0163931 -0.0718418 0.3905583 -0.6810000 -0.4276135
-0.4181148 3.3717021 -1.3289081 -0.2723291 -1.2486880 0.6310202 -1.1536844 3.9566022 -0.5224844 -1.3126910 -0.6723188 0.3043541 -1.1417159 0.8227800
-0.4151014 2.9429307 -1.3449423 -0.2723291 -1.2227986 -0.8847413 -1.6581453 3.2248775 -0.6373311 -0.4404796 1.6372081 0.2861713 -0.6445909 -0.4711055
-0.4077097 2.9429307 -1.3449423 -0.2723291 -1.2227986 -0.4961940 -1.7434063 3.2248775 -0.6373311 -0.4404796 1.6372081 0.2121255 -0.9918782 -0.2101538
0.6242409 -0.4872402 1.0149946 3.6647712 1.8580364 -0.1033769 1.0240236 -0.7944316 1.6596029 1.5294129 0.8057784 0.2306369 0.6927453 -0.5145975
0.0274574 -0.4872402 1.0149946 3.6647712 1.8580364 0.1570779 0.7966610 -0.6125452 1.6596029 1.5294129 0.8057784 0.3797143 0.0863929 -0.0905509
0.1846466 -0.4872402 1.0149946 3.6647712 1.8580364 -0.2243532 0.5266678 -0.5092547 1.6596029 1.5294129 0.8057784 0.4245142 -0.1642701 0.0181789
0.0753105 -0.4872402 1.0149946 -0.2723291 1.8580364 -0.2457019 0.4520644 -0.6106931 1.6596029 1.5294129 0.8057784 0.3731422 0.0023717 0.0073060
0.1079337 -0.4872402 1.0149946 -0.2723291 1.8580364 0.1613476 0.6900847 -0.6063715 1.6596029 1.5294129 0.8057784 0.1959143 -0.6810000 0.2682577
0.0259624 -0.4872402 1.0149946 -0.2723291 1.8580364 -0.0478701 0.8002135 -0.7121316 1.6596029 1.5294129 0.8057784 -0.0659843 0.2152253 -0.2862647
0.0075215 -0.4872402 1.0149946 -0.2723291 1.8580364 -1.3131396 0.9813931 -0.8032647 1.6596029 1.5294129 0.8057784 0.2641547 -0.3449156 -0.1884078
0.0707857 -0.4872402 1.0149946 3.6647712 1.8580364 -0.6854862 0.7256101 -0.8977222 1.6596029 1.5294129 0.8057784 -0.0398054 0.2782411 -0.6233273
-0.0161882 -0.4872402 1.0149946 3.6647712 1.4092873 3.5515296 0.5089051 -0.8977222 1.6596029 1.5294129 0.8057784 -0.0232656 -1.0310881 -0.0688050
0.1095555 -0.4872402 1.0149946 -0.2723291 1.4092873 -3.8764132 0.6865322 -1.0361553 1.6596029 1.5294129 0.8057784 -0.0216226 -0.7748236 0.5400824
0.0096990 -0.4872402 1.0149946 -0.2723291 1.4092873 -1.8810164 0.8108711 -0.9700968 1.6596029 1.5294129 0.8057784 -0.4451952 0.1886186 -0.0688050
1.1519647 -0.4872402 1.0149946 -0.2723291 0.6584956 -3.4465917 1.1163897 -1.0848799 1.6596029 1.5294129 0.8057784 -2.4673242 0.0947950 0.0616709
0.1493565 -0.4872402 1.0149946 -0.2723291 0.6584956 -1.8710537 1.1163897 -1.1694595 1.6596029 1.5294129 0.8057784 0.2064297 -1.3153595 2.9865046
0.2390799 -0.4872402 1.0149946 3.6647712 0.6584956 0.5669739 1.0027084 -1.1579669 1.6596029 1.5294129 0.8057784 0.2043485 -1.2495430 2.9865046
0.3400827 -0.4872402 1.0149946 3.6647712 0.6584956 1.0409163 1.0275762 -1.2312439 1.6596029 1.5294129 0.8057784 0.3874913 -1.3573701 2.9865046
0.6532287 -0.4872402 1.0149946 -0.2723291 0.6584956 -0.0976839 1.1163897 -1.2470580 1.6596029 1.5294129 0.8057784 0.1037952 -0.4373388 2.9865046
0.5410338 -0.4872402 1.0149946 3.6647712 0.9777978 -0.5830122 0.7469254 -1.2658165 1.6596029 1.5294129 0.8057784 -0.0963256 -0.5283617 2.9865046
0.8713058 -0.4872402 1.0149946 -0.2723291 0.9777978 -1.9621417 1.1163897 -1.2446360 1.6596029 1.5294129 0.8057784 0.4406159 3.0971497 -0.9495170
1.7304654 -0.4872402 1.0149946 -0.2723291 0.9777978 -3.0551979 1.1163897 -1.2623023 1.6596029 1.5294129 0.8057784 0.4406159 3.5452624 -0.9495170
1.8596166 -0.4872402 1.0149946 -0.2723291 1.0036872 1.4636216 1.0417863 -1.1771529 1.6596029 1.5294129 0.8057784 0.4406159 0.1101988 -0.8190411
1.3572534 -0.4872402 1.0149946 -0.2723291 1.0036872 0.5185834 0.8783694 -1.1635707 1.6596029 1.5294129 0.8057784 0.0695107 1.4825438 -0.9386440
0.7219594 -0.4872402 1.0149946 -0.2723291 1.0036872 0.7249547 1.0737592 -1.1573496 1.6596029 1.5294129 0.8057784 0.4406159 1.2024734 -1.0038819
2.3291951 -0.4872402 1.0149946 -0.2723291 1.0036872 0.1357291 0.9813931 -1.1440049 1.6596029 1.5294129 0.8057784 0.4406159 1.5455597 -1.0256279
1.6570484 -0.4872402 1.0149946 -0.2723291 1.0036872 -0.0877212 1.1163897 -1.1440049 1.6596029 1.5294129 0.8057784 0.4060028 1.2780924 -1.3409445
9.9241096 -0.4872402 1.0149946 -0.2723291 1.0036872 0.9726003 0.8286338 -1.1295680 1.6596029 1.5294129 0.8057784 0.4406159 0.6381316 -1.3191985
1.4254272 -0.4872402 1.0149946 -0.2723291 1.0036872 0.3705654 1.0844168 -1.0807958 1.6596029 1.5294129 0.8057784 0.4406159 1.1800678 -1.2648336
0.6479646 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.0654941 1.1163897 -1.0517320 1.6596029 1.5294129 0.8057784 0.4406159 1.5329565 -1.2213417
0.5090895 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.0882661 1.1163897 -1.0741947 1.6596029 1.5294129 0.8057784 0.4406159 1.6673903 -1.1126118
1.9149323 -0.4872402 1.0149946 -0.2723291 1.2539511 -2.7278503 0.8037660 -1.1186453 1.6596029 1.5294129 0.8057784 -0.7759914 2.5174040 -1.4931663
1.5344076 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.4341159 1.0488914 -1.1250089 1.6596029 1.5294129 0.8057784 0.4406159 2.5426103 -1.6671342
2.4158772 -0.4872402 1.0149946 -0.2723291 1.2539511 -2.3236472 1.1163897 -1.1054906 1.6596029 1.5294129 0.8057784 0.4406159 2.1883213 -1.3083256
2.2069961 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.8283562 0.7433728 -1.0811757 1.6596029 1.5294129 0.8057784 0.4406159 2.7078519 -1.6453882
1.2463082 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.9991462 1.1163897 -1.0474104 1.6596029 1.5294129 0.8057784 0.1779505 2.5160036 -1.3409445
0.5276048 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.2732886 1.0773117 -0.9815894 1.6596029 1.5294129 0.8057784 0.4406159 1.1478597 -1.1995957
0.3893052 -0.4872402 1.0149946 -0.2723291 1.2539511 -0.8135788 1.0098134 -0.8873694 1.6596029 1.5294129 0.8057784 0.4135607 0.6241280 -0.8081681
0.1952587 -0.4872402 1.0149946 -0.2723291 1.2539511 -0.3325202 0.4946949 -0.7727762 1.6596029 1.5294129 0.8057784 0.2377567 0.8551861 0.0725439
0.9259239 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.7771192 1.0098134 -0.9616911 1.6596029 1.5294129 0.8057784 0.4406159 1.8242297 -1.3953095
0.5849224 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.1304187 0.8535016 -0.9516232 1.6596029 1.5294129 0.8057784 0.4406159 0.3524598 -0.9495170
1.1330844 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.5659332 0.9281050 -0.9559448 1.6596029 1.5294129 0.8057784 0.4406159 0.5177013 -1.0691198
0.5932918 -0.4872402 1.0149946 -0.2723291 1.1935426 0.2652449 1.0737592 -0.9827291 1.6596029 1.5294129 0.8057784 0.3867246 0.6255284 -1.0256279
0.2625722 -0.4872402 1.0149946 -0.2723291 1.1935426 0.1713104 0.9742880 -1.0059517 1.6596029 1.5294129 0.8057784 0.4406159 0.9406076 -1.0908658
0.4718334 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.7651883 1.0773117 -1.0265623 1.6596029 1.5294129 0.8057784 0.3989925 1.0176270 -1.5257853
4.0386089 -0.4872402 1.0149946 -0.2723291 1.1935426 -1.1836238 1.1163897 -1.0948528 1.6596029 1.5294129 0.8057784 0.4406159 2.5118026 -1.9063399
0.7327784 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.6157470 0.3277255 -1.0897239 1.6596029 1.5294129 0.8057784 -0.2027938 2.4249808 -1.7649910
2.4917124 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.4236082 1.1163897 -1.0477428 1.6596029 1.5294129 0.8057784 0.4406159 1.9768681 -1.8411020
1.2349731 -0.4872402 1.0149946 -0.2723291 1.1935426 0.0830689 1.1163897 -1.0547238 1.6596029 1.5294129 0.8057784 0.4406159 1.0736410 -1.6671342
0.6954781 -0.4872402 1.0149946 -0.2723291 1.1935426 0.1698871 1.1163897 -1.0239029 1.6596029 1.5294129 0.8057784 0.2128922 1.0722407 -1.1343578
2.4632989 -0.4872402 1.0149946 -0.2723291 1.1935426 -1.3316418 0.9742880 -0.9936043 1.6596029 1.5294129 0.8057784 0.4406159 0.9966217 -1.5475313
4.4080076 -0.4872402 1.0149946 -0.2723291 1.1935426 -1.0726103 0.5977186 -1.0389097 1.6596029 1.5294129 0.8057784 -0.2980894 2.0622896 -1.5257853
7.4762471 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.8562763 1.1163897 -1.1253414 1.6596029 1.5294129 0.8057784 0.3099404 1.4461347 -1.9063399
1.9883261 -0.4872402 1.0149946 -0.2723291 0.9001297 -3.0551979 1.1163897 -1.2427839 1.6596029 1.5294129 0.8057784 0.1483760 1.4965474 -1.1561037
0.9693115 -0.4872402 1.0149946 -0.2723291 0.9001297 -0.9630200 1.1163897 -1.1919222 1.6596029 1.5294129 0.8057784 -0.2692816 -0.0732473 0.5835743
0.4406611 -0.4872402 1.0149946 -0.2723291 0.3650828 -0.9502108 1.0417863 -1.1114268 1.6596029 1.5294129 0.8057784 -0.4604205 1.9250551 -0.5798354
1.2584688 -0.4872402 1.0149946 -0.2723291 0.3650828 0.8075032 1.1163897 -1.1062979 1.6596029 1.5294129 0.8057784 -1.9422126 0.9980220 0.5400824
5.5248535 -0.4872402 1.0149946 -0.2723291 0.3650828 -0.7509558 1.1163897 -1.1312301 1.6596029 1.5294129 0.8057784 -3.8783565 -0.3561184 -0.8190411
1.2134072 -0.4872402 1.0149946 -0.2723291 0.3650828 0.5299694 1.1163897 -1.0768541 1.6596029 1.5294129 0.8057784 -3.5229148 1.1996727 -0.5798354
1.7668310 -0.4872402 1.0149946 -0.2723291 0.3650828 -2.3578052 1.1163897 -1.0643168 1.6596029 1.5294129 0.8057784 -3.5914839 3.0411357 -0.5037245
2.9113695 -0.4872402 1.0149946 -0.2723291 0.3650828 -1.6077524 1.1163897 -1.0474579 1.6596029 1.5294129 0.8057784 -1.5959718 1.0400326 -0.6776923
4.8982568 -0.4872402 1.0149946 -0.2723291 1.1935426 -2.5129395 1.1163897 -1.0147848 1.6596029 1.5294129 0.8057784 -2.9399686 3.4066275 -1.6888801
1.6823810 -0.4872402 1.0149946 -0.2723291 1.0727255 0.2125846 1.1163897 -0.9309651 1.6596029 1.5294129 0.8057784 -3.6083523 2.2961484 -1.6671342
0.8394627 -0.4872402 1.0149946 -0.2723291 1.0727255 0.7078757 0.7895559 -0.9381836 1.6596029 1.5294129 0.8057784 -3.6705683 1.8396336 -1.6345152
2.5957053 -0.4872402 1.0149946 -0.2723291 1.0727255 -1.3956881 0.7291627 -1.0198662 1.6596029 1.5294129 0.8057784 -2.5117955 1.9586635 -1.3191985
8.1288391 -0.4872402 1.0149946 -0.2723291 1.0727255 -0.4663057 1.1163897 -0.9462094 1.6596029 1.5294129 0.8057784 -3.7266503 1.1156516 -1.4931663
0.9531748 -0.4872402 1.0149946 -0.2723291 1.4092873 0.7676522 0.2815424 -0.9502935 1.6596029 1.5294129 0.8057784 -3.3761377 1.4125262 -1.5366583
0.8688993 -0.4872402 1.0149946 -0.2723291 1.4092873 0.1798499 1.1163897 -0.9194726 1.6596029 1.5294129 0.8057784 -0.4154016 0.3314545 -0.6342003
0.3963319 -0.4872402 1.0149946 -0.2723291 1.4092873 -0.3965665 0.9494202 -0.9120166 1.6596029 1.5294129 0.8057784 -0.4019288 0.4266784 -0.9060250
0.9806002 -0.4872402 1.0149946 -0.2723291 0.5117892 -0.9060900 0.6758745 -0.8756394 1.6596029 1.5294129 0.8057784 -0.7133373 0.2026221 -0.1884078
0.3995673 -0.4872402 1.0149946 -0.2723291 0.5117892 -0.2585112 0.5870610 -0.8421115 1.6596029 1.5294129 0.8057784 -3.8792328 1.4895456 -0.9930089
0.6020542 -0.4872402 1.0149946 -0.2723291 0.2528955 -1.0242198 0.0719425 -0.8223082 1.6596029 1.5294129 0.8057784 -3.8668553 0.6311298 -1.1778497
1.4237880 -0.4872402 1.0149946 -0.2723291 1.0727255 -0.5531240 0.9529728 -0.8953952 1.6596029 1.5294129 0.8057784 -3.8227126 1.6435843 -1.5475313
1.0037355 -0.4872402 1.0149946 -0.2723291 0.2528955 -0.6370957 -0.3152846 -0.8536040 1.6596029 1.5294129 0.8057784 -3.6368314 0.4252781 -1.3409445
3.9584024 -0.4872402 1.0149946 -0.2723291 1.0727255 -0.1176094 0.3596983 -0.9175730 1.6596029 1.5294129 0.8057784 -3.7006904 0.2614369 -1.2648336
0.4363851 -0.4872402 1.0149946 -0.2723291 1.0727255 -0.1304187 0.3383831 -0.8830478 1.6596029 1.5294129 0.8057784 -2.8473018 1.2416833 -1.2539606
0.6656207 -0.4872402 1.0149946 -0.2723291 1.0727255 0.1357291 0.9600779 -0.8675661 1.6596029 1.5294129 0.8057784 -3.2417380 1.6001734 -1.4170554
0.5671779 -0.4872402 1.0149946 -0.2723291 0.2528955 0.0901851 0.6225864 -0.8274371 1.6596029 1.5294129 0.8057784 -2.9927645 0.6983467 -0.8734060
0.7497230 -0.4872402 1.0149946 -0.2723291 0.2528955 0.7804615 0.9138948 -0.8105782 1.6596029 1.5294129 0.8057784 -3.0159860 0.9854189 -0.9168980
0.3290719 -0.4872402 1.0149946 -0.2723291 0.2528955 0.1997754 0.2211492 -0.7572945 1.6596029 1.5294129 0.8057784 -2.8339385 -0.0872508 -0.6994382
0.2287434 -0.4872402 1.0149946 -0.2723291 1.3661384 0.2154311 0.6865322 -0.7024911 1.6596029 1.5294129 0.8057784 -2.8094026 0.4994967 -0.8951520
1.1974449 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.1090699 0.9387626 -0.7469417 1.6596029 1.5294129 0.8057784 -2.8045831 0.3524598 -1.1778497
0.8773861 -0.4872402 1.0149946 -0.2723291 1.5991427 0.4901184 0.9245525 -0.7932444 1.6596029 1.5294129 0.8057784 -2.7035916 1.4867449 -0.9930089
1.2564343 -0.4872402 1.0149946 -0.2723291 1.5991427 0.2510124 0.8783694 -0.8512296 1.6596029 1.5294129 0.8057784 -3.6057234 0.7557611 -1.4061824
1.3443720 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.1887719 1.1163897 -0.8932106 1.6596029 1.5294129 0.8057784 -3.8047489 1.9320568 -1.5040393
1.1700894 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.4976172 0.6865322 -0.9376612 1.6596029 1.5294129 0.8057784 -3.1515905 2.9921233 -1.5366583
0.6716359 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.9359783 0.8996847 -0.9392759 1.6596029 1.5294129 0.8057784 0.4406159 1.4321312 -1.0582468
2.1435191 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.6641375 0.8463965 -0.9160058 1.6596029 1.5294129 0.8057784 0.3809192 1.3243041 -1.3083256
0.7104138 -0.4872402 1.0149946 -0.2723291 1.5991427 0.1727336 1.0169185 -0.8215484 1.6596029 1.5294129 0.8057784 0.3207844 0.9616129 -0.5907084
0.2386602 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.0934142 1.1163897 -0.8501848 1.6596029 1.5294129 0.8057784 0.4273621 0.5513097 -0.4493595
0.7385901 -0.4872402 1.0149946 -0.2723291 1.5991427 0.2851704 1.1163897 -0.8627221 1.6596029 1.5294129 0.8057784 0.3292186 0.8677893 -0.7755492
1.0682704 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.6129005 0.9956033 -0.9020438 1.6596029 1.5294129 0.8057784 -1.2722954 1.5595632 -1.2757066
0.8205824 -0.4872402 1.0149946 -0.2723291 1.5991427 0.2481659 0.9316575 -0.8582106 1.6596029 1.5294129 0.8057784 -3.4351771 1.5861699 -1.1669767
0.3109379 -0.4872402 1.0149946 -0.2723291 1.5991427 0.0802224 0.9884982 -0.8182715 1.6596029 1.5294129 0.8057784 -0.4235072 0.7193520 -0.8299141
0.7337433 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.0478701 0.9956033 -0.7584343 1.6596029 1.5294129 0.8057784 0.3488254 0.5303045 -1.0799928
0.6644814 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.1418047 1.0702067 -0.7282307 1.6596029 1.5294129 0.8057784 0.4406159 0.7669640 -0.9168980
0.4548586 -0.4872402 1.0149946 -0.2723291 1.3661384 0.1883894 1.0559965 -0.7646079 1.6596029 1.5294129 0.8057784 -0.5746657 0.9322055 -1.0365009
0.3608882 -0.4872402 1.0149946 -0.2723291 1.3661384 0.6609085 0.8535016 -0.6987869 1.6596029 1.5294129 0.8057784 -3.9033305 0.6703397 -0.9930089
0.2124754 -0.4872402 1.0149946 -0.2723291 1.3661384 0.5271229 1.0524439 -0.6837801 1.6596029 1.5294129 0.8057784 -0.0151600 0.7109499 -0.7972951
0.1716722 -0.4872402 1.0149946 -0.2723291 1.3661384 0.0175994 0.8250813 -0.6776064 1.6596029 1.5294129 0.8057784 0.3112548 0.6465337 -0.6994382
0.5388063 -0.4872402 1.0149946 -0.2723291 1.3661384 1.5774816 1.0915219 -0.6374774 1.6596029 1.5294129 0.8057784 0.2102634 0.5723150 -0.5145975
0.6859357 -0.4872402 1.0149946 -0.2723291 1.3661384 0.6310202 0.9067897 -0.6168668 1.6596029 1.5294129 0.8057784 -3.8336662 0.8481844 -0.8299141
0.1324002 -0.4872402 1.0149946 -0.2723291 1.3661384 0.3421004 0.6367966 -0.6455032 1.6596029 1.5294129 0.8057784 -3.3490825 0.7669640 -0.9168980
0.1226880 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.4392640 0.6865322 -0.5767378 1.6596029 1.5294129 0.8057784 -3.7920428 0.8901949 -1.0691198
0.5332828 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.4961940 0.4165390 -0.4824229 1.6596029 1.5294129 0.8057784 -3.8684983 0.6003221 -0.9821359
0.4811585 -0.4872402 1.0149946 -0.2723291 1.3661384 0.0232924 0.5373254 -0.4805707 1.6596029 1.5294129 0.8057784 -0.9251783 0.5008971 -0.8299141
0.3705900 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.2898227 0.5621932 -0.5117241 1.6596029 1.5294129 0.8057784 0.4406159 0.2866432 -0.2753917
0.1393478 -0.4872402 1.0149946 -0.2723291 1.3661384 0.5925924 0.7611355 -0.5687120 1.6596029 1.5294129 0.8057784 -1.1111691 0.5275038 -0.6668193
0.0092526 -0.4872402 1.0149946 -0.2723291 1.3661384 0.1300361 0.7042949 -0.5831490 1.6596029 1.5294129 0.8057784 0.3807001 0.2796414 -0.5254704
0.3535872 -0.4872402 1.0149946 -0.2723291 1.3661384 0.0460644 0.5124576 -0.5036983 1.6596029 1.5294129 0.8057784 0.4406159 0.1872182 -0.3297567
0.2566546 -0.4872402 1.0149946 -0.2723291 1.3661384 0.3250214 0.7575830 -0.4717851 1.6596029 1.5294129 0.8057784 0.4068791 -0.3309120 -0.2536457
0.4912834 -0.4872402 1.0149946 -0.2723291 0.8656106 -0.1076467 -0.1127897 -0.3949464 1.6596029 1.5294129 0.8057784 0.4406159 0.0793911 -0.1231699
-0.0523073 -0.4872402 1.0149946 -0.2723291 0.8656106 -0.7481093 -0.7238268 -0.3459843 1.6596029 1.5294129 0.8057784 -0.2439790 0.2068231 -0.2862647
0.0187706 -0.4872402 1.0149946 -0.2723291 0.8656106 -0.4734220 0.5728508 -0.4385897 1.6596029 1.5294129 0.8057784 -3.6657487 0.6297295 -0.3841216
0.0940246 -0.4872402 1.0149946 -0.2723291 0.2528955 -0.4008362 0.9209999 -0.5958763 1.6596029 1.5294129 0.8057784 -0.2780445 1.2136763 -0.3732486
1.3907009 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.5104265 0.0861526 -0.4210659 1.6596029 1.5294129 0.8057784 0.1321648 0.7669640 -0.3732486
1.0999857 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.8135788 -0.4218608 -0.4612898 1.6596029 1.5294129 0.8057784 0.4406159 0.2950453 -0.2645187
0.0854807 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.1674232 0.5479830 -0.3617035 1.6596029 1.5294129 0.8057784 0.4406159 0.5092992 -0.2862647
0.0493965 -0.4872402 1.0149946 -0.2723291 -0.1958536 -0.0791817 0.7860033 -0.3304076 1.6596029 1.5294129 0.8057784 0.4234189 0.0303788 -0.3188837
-0.0052134 -0.4872402 1.0149946 -0.2723291 0.2183763 0.2168544 0.2282543 -0.4267172 1.6596029 1.5294129 0.8057784 0.4019500 0.2390312 0.0725439
0.1201373 -0.4872402 1.0149946 -0.2723291 0.5117892 0.9896793 -0.0346338 -0.5993905 1.6596029 1.5294129 0.8057784 0.1972287 -0.1390638 0.7901611
0.5164498 -0.4872402 1.0149946 -0.2723291 0.2528955 -1.2206283 0.9529728 -0.6483526 1.6596029 1.5294129 0.8057784 -0.0448441 0.7683643 -0.9495170
0.3231508 -0.4872402 1.0149946 -0.2723291 0.2528955 -0.1745394 1.0240236 -0.7546351 1.6596029 1.5294129 0.8057784 -0.5905484 1.6029741 -1.0038819
0.1462396 -0.4872402 1.0149946 -0.2723291 0.5117892 0.2837472 0.8890270 -0.7074776 1.6596029 1.5294129 0.8057784 0.4330580 0.8439833 -0.6342003
1.3264915 -0.4872402 1.0149946 -0.2723291 0.5117892 -1.3956881 1.0204711 -0.8046419 1.6596029 1.5294129 0.8057784 -0.0788000 1.7164026 -1.1452307
0.7695683 -0.4872402 1.0149946 -0.2723291 0.5117892 -0.1418047 0.9991558 -0.7714940 1.6596029 1.5294129 0.8057784 0.2522154 0.7529604 -0.8625331
1.2463082 -0.4872402 1.0149946 -0.2723291 0.5117892 -0.0791817 0.6900847 -0.8756394 1.6596029 1.5294129 0.8057784 0.2918671 0.0639872 -0.1231699
0.2569871 -0.4872402 1.0149946 -0.2723291 -0.1958536 -0.0606794 -0.1376575 -0.1761129 1.6596029 1.5294129 0.8057784 0.4406159 -0.2678962 0.0507979
0.2435210 -0.4872402 1.0149946 -0.2723291 -0.1958536 0.6623317 0.2247018 -0.2200411 1.6596029 1.5294129 0.8057784 0.3986639 -0.6880018 0.1269088
0.2461926 -0.4872402 1.0149946 -0.2723291 -0.1958536 1.1049625 0.2993051 -0.1825715 1.6596029 1.5294129 0.8057784 0.4228712 -0.7902275 0.2682577
-0.0924419 -0.4872402 1.0149946 -0.2723291 -0.1958536 -0.7438395 -1.0044776 0.1440166 1.6596029 1.5294129 0.8057784 0.3970209 -0.3127075 -0.0796779
-0.1435735 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.5887052 -0.9476370 -0.0337381 1.6596029 1.5294129 0.8057784 0.1539623 0.0961953 -0.2101538
0.0069925 -0.4872402 1.0149946 -0.2723291 0.2442657 0.0389481 -0.5923828 0.0933924 1.6596029 1.5294129 0.8057784 0.3499208 -0.2903018 -0.1449159
0.2416108 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.2428554 0.3987763 -0.1183177 1.6596029 1.5294129 0.8057784 0.3943920 0.3258531 -0.3732486
0.1420845 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.5403147 -0.5461998 -0.3052380 1.6596029 1.5294129 0.8057784 0.3455394 -0.1684712 -0.2101538
-0.4025630 -0.4872402 2.4201701 -0.2723291 0.4686402 -1.1822006 0.8570542 -0.9375187 -0.6373311 1.7964164 0.7595879 0.4207900 0.7571615 -0.7972951
-0.3987834 -0.4872402 2.4201701 -0.2723291 0.4686402 -1.2391306 1.0559965 -0.9686246 -0.6373311 1.7964164 0.7595879 -0.1382776 1.5847695 -1.6888801
-0.3959828 -0.4872402 2.4201701 -0.2723291 0.4686402 -1.6959939 1.0453389 -0.9367114 -0.6373311 1.7964164 0.7595879 -0.4189067 2.3843705 -1.5692773
-0.4078085 -0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 1.0737592 -0.9151035 -0.6373311 1.7964164 0.7595879 0.3662415 0.7585618 -0.9712629
-0.4071598 -0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 0.5302203 -0.8002729 -0.6373311 1.7964164 0.7595879 0.4406159 0.0975957 -0.2645187
-0.3999530 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.8221183 -0.5177794 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 -0.0900515 -0.0796779
-0.3875994 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.5104265 -0.9227692 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 0.1312041 0.2138927
-0.3992926 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.8747785 -1.4130199 -0.4732098 -0.4076377 -0.1022751 0.3438730 0.4010737 0.6927453 0.0616709
-0.3864333 -0.4872402 -0.2108898 -0.2723291 0.2615253 -1.2732886 0.1536509 -0.4732098 -0.4076377 -0.1022751 0.3438730 0.4406159 1.1884699 -0.3080107
-0.3889003 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.6982955 0.0719425 -0.4285218 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2026221 -0.4602325
-0.3923020 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3780642 -0.1163422 -0.6581830 -0.4076377 -0.1022751 0.3438730 0.4406159 0.0373805 -0.1449159
-0.3994275 -0.4872402 -0.2108898 -0.2723291 0.2615253 -1.0185268 0.1749662 -0.6625521 -0.4076377 -0.1022751 0.3438730 0.4282384 0.3426573 -0.5472164
-0.3940157 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3666782 0.3952238 -0.6158695 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2348302 -0.6233273
-0.4128204 -0.4872402 0.1156240 -0.2723291 0.1579678 0.4388814 0.0186544 -0.6251775 -0.9818712 -0.8024176 1.1753027 0.3868341 -0.4177339 -0.0144400
-0.4148387 -0.4872402 0.1156240 -0.2723291 0.1579678 -0.2343159 0.2886475 -0.7159308 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.5003546 -0.2101538
-0.4130378 -0.4872402 0.1156240 -0.2723291 0.1579678 0.9839863 0.7966610 -0.7729187 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.9820757 0.1486548
-0.4073609 -0.4872402 0.1156240 -0.2723291 0.1579678 0.7249547 0.7362677 -0.6677760 -0.9818712 -0.8024176 1.1753027 0.4028263 -0.8644462 -0.0579320
-0.4145899 -0.4872402 0.1156240 -0.2723291 0.1579678 -0.3624084 0.4343017 -0.6126402 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.6683969 -1.1561037

1.3.2. Create a categorical variable (crime rate)

# summary of the scaled crime rate
summary(boston_scaled$crim)
##      Min.   1st Qu.    Median      Mean   3rd Qu.      Max. 
## -0.419367 -0.410563 -0.390280  0.000000  0.007389  9.924110
# create a quantile vector of crim and print it
bins <- quantile(boston_scaled$crim)
knitr::kable(bins, caption="Quantiles of crim") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Quantiles of crim

              x
0%   -0.4193669
25%  -0.4105633
50%  -0.3902803
75%   0.0073892
100%  9.9241096
# create a categorical variable 'crime'
crime <- cut(boston_scaled$crim, breaks = bins, include.lowest = TRUE, labels = c("low", "med_low", "med_high", "high"))

# look at the table of the new factor crime
knitr::kable(table(crime), caption="Frequencies of the crime categories") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Frequencies of the crime categories

crime     Freq
low        127
med_low    126
med_high   126
high       127
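The four classes come out nearly equal in size (127/126/126/127 of the 506 observations) because quantile() by default returns the 0 %, 25 %, 50 %, 75 % and 100 % cut points, so cut() with those breaks splits the data into quartiles. A minimal sketch of the same idea on made-up toy data (the vector x below is purely illustrative):

```r
# Toy demonstration: quantile breaks give (near-)equal-sized classes
x <- 1:100                                     # hypothetical data vector
bins <- quantile(x)                            # 0%, 25%, 50%, 75%, 100% cut points
groups <- cut(x, breaks = bins,
              include.lowest = TRUE,           # keep the minimum in the first class
              labels = c("low", "med_low", "med_high", "high"))
table(groups)                                  # each class holds about 25% of the values
```

Note that include.lowest = TRUE is needed: without it the smallest value falls outside the first interval and becomes NA.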
# remove original crim from the dataset
boston_scaled <- dplyr::select(boston_scaled, -crim)

# add the new categorical value to scaled data
boston_scaled <- data.frame(boston_scaled, crime)
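As a side note, the two steps above (dropping crim, then binding the factor back on) could also be written as a single dplyr pipeline. This is just an equivalent alternative, assuming dplyr is loaded and the crime factor exists in the workspace:

```r
# Alternative: drop crim and attach the new factor in one pipeline
# (same result as the two separate steps above)
boston_scaled <- boston_scaled %>%
  dplyr::select(-crim) %>%
  dplyr::mutate(crime = crime)   # RHS 'crime' is the factor from the workspace
```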

knitr::kable(boston_scaled, caption = "Scaled Boston data set with categorical variable crime") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px") # show the data frame in a scrollable box
Scaled Boston data set with categorical variable crime
zn indus chas nox rm age dis rad tax ptratio black lstat medv crime
0.2845483 -1.2866362 -0.2723291 -0.1440749 0.4132629 -0.1198948 0.1400750 -0.9818712 -0.6659492 -1.4575580 0.4406159 -1.0744990 0.1595278 low
-0.4872402 -0.5927944 -0.2723291 -0.7395304 0.1940824 0.3668034 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.4406159 -0.4919525 -0.1014239 low
-0.4872402 -0.5927944 -0.2723291 -0.7395304 1.2814456 -0.2655490 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.3960351 -1.2075324 1.3229375 low
-0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886 low
-0.4872402 -1.3055857 -0.2723291 -0.8344581 1.2273620 -0.5106743 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4406159 -1.0254866 1.4860323 low
-0.4872402 -1.3055857 -0.2723291 -0.8344581 0.2068916 -0.3508100 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4101651 -1.0422909 0.6705582 low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.1603069 0.9778406 1.0236249 -0.5224844 -0.5769480 -1.5037485 0.4406159 0.9097999 0.4965904 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.9302853 1.1163897 1.0861216 -0.5224844 -0.5769480 -1.5037485 0.3281233 2.4193794 -0.6559463 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3994130 0.6154813 1.3283202 -0.5224844 -0.5769480 -1.5037485 0.3289995 0.6227277 -0.3949946 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 0.1314594 0.9138948 1.2117800 -0.5224844 -0.5769480 -1.5037485 0.3926395 1.0918456 -0.8190411 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3922967 0.5089051 1.1547920 -0.5224844 -0.5769480 -1.5037485 0.4406159 0.0863929 -0.3949946 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.5630867 -1.0506607 0.7863653 -0.5224844 -0.5769480 -1.5037485 0.3705134 0.4280788 -0.0905509 med_low
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4776917 -0.2406812 0.4333252 -0.6373311 -0.6006817 1.1753027 0.4406159 -0.6151835 -0.2318998 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 0.5657458 0.3166900 -0.6373311 -0.6006817 1.1753027 0.2557205 -0.3351131 -0.4711055 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6413655 -0.4289659 0.3341188 -0.6373311 -0.6006817 1.1753027 0.4265954 -0.5857761 -0.2862647 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4976172 -1.3952572 0.3341188 -0.6373311 -0.6006817 1.1753027 0.3305330 -0.8504426 0.0616709 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4193385 0.4662746 0.2198106 -0.6373311 -0.6006817 1.1753027 0.3294377 0.2824421 -0.5472164 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.1793541 -1.1359217 0.0006921 -0.6373311 -0.6006817 1.1753027 -0.7413783 -0.1348628 -0.2536457 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.7936533 0.0328645 0.0006921 -0.6373311 -0.6006817 1.1753027 0.3754425 -0.1922772 -0.4711055 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.0171035 1.0488914 0.0013569 -0.6373311 -0.6006817 1.1753027 0.2179309 1.1716657 -0.9712629 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4549197 0.7327152 0.1031753 -0.6373311 -0.6006817 1.1753027 0.3927490 0.1648126 -0.3188837 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2030044 0.8215287 0.0863639 -0.6373311 -0.6006817 1.1753027 0.4406159 0.8495847 -0.7972951 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 1.1163897 0.1425445 -0.6373311 -0.6006817 1.1753027 0.4147656 1.0120256 -0.8734060 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.5132730 0.9067897 0.2871038 -0.6373311 -0.6006817 1.1753027 0.4124654 0.5106995 -0.7538032 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.9758293 0.6083763 0.3132232 -0.6373311 -0.6006817 1.1753027 -0.5833190 0.5401069 -0.9386440 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 0.7717932 0.4212153 -0.6373311 -0.6006817 1.1753027 0.2213265 0.3020471 -0.6450733 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.3382132 0.7185050 0.3126533 -0.6373311 -0.6006817 1.1753027 -0.5508966 0.6479340 -0.8407871 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 0.2994029 0.9174474 0.3132707 -0.6373311 -0.6006817 1.1753027 0.3424724 0.0205763 -0.4493595 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 0.5541647 0.6652169 0.2108350 -0.6373311 -0.6006817 1.1753027 0.2580207 -0.0942525 -0.1666618 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8135788 0.9067897 0.2079856 -0.6373311 -0.6006817 1.1753027 0.0382932 1.3929213 -1.0691198 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.3026319 1.1163897 0.1804414 -0.6373311 -0.6006817 1.1753027 0.2196834 0.0541848 -0.8734060 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4762685 0.4769322 0.0925851 -0.6373311 -0.6006817 1.1753027 -1.3590472 2.1085012 -1.0147549 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8306578 0.9387626 -0.0037245 -0.6373311 -0.6006817 1.1753027 0.0229582 0.7977717 -1.0256279 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 1.0062609 -0.0167367 -0.6373311 -0.6006817 1.1753027 -1.1869674 1.0764418 -0.9821359 med_high
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.5004637 -0.0133185 -0.2064589 -0.5224844 -0.7668172 0.3438730 0.4406159 -0.4163335 -0.3949946 low
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.6314027 -0.2548913 -0.1981007 -0.5224844 -0.7668172 0.3438730 0.2287748 -0.1740726 -0.2753917 med_low
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.6185935 -0.9618471 0.0660857 -0.5224844 -0.7668172 0.3438730 0.4406159 -0.5437656 -0.1666618 low
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.4534965 -1.3632843 0.0248170 -0.5224844 -0.7668172 0.3438730 0.4026072 -0.3533177 0.2356387 med_low
2.7285450 -1.1933466 -0.2723291 -1.0933517 0.4417279 -1.6616978 0.7627153 -0.7521778 -0.9270193 -0.0718418 0.4267049 -1.1669222 0.8988910 low
2.7285450 -1.1933466 -0.2723291 -1.0933517 1.0523023 -1.8748503 0.7627153 -0.7521778 -0.9270193 -0.0718418 0.4265954 -1.4946046 1.3446835 low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 0.6907967 -2.3331282 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.3147600 -1.0941039 0.4422255 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.1645767 -2.2016841 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.2924148 -0.9582698 0.3008766 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.1048002 -2.2052367 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.4138893 -0.7300124 0.2356387 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.3069017 -1.0151352 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.3583550 -0.4345381 -0.1449159 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.8576995 -1.2353928 0.6199131 -0.7521778 -1.0397541 -0.2566040 0.4406159 -0.3421149 -0.3515026 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.7096815 -1.2531555 0.6199131 -0.7521778 -1.0397541 -0.2566040 0.4406159 0.2096238 -0.2753917 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.3624084 0.6012712 0.8996287 -0.7521778 -1.0397541 -0.2566040 0.3950493 0.8607875 -0.6450733 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -1.2604793 0.9494202 0.9853955 -0.7521778 -1.0397541 -0.2566040 0.4406159 2.5426103 -0.8842790 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.9715595 -0.2335761 1.0887811 -0.7521778 -1.0397541 -0.2566040 0.4406159 0.4966960 -0.3406296 med_low
0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.4577662 -0.8126404 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4259382 0.1115992 -0.3080107 med_low
0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.2414322 -0.1980507 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4085221 -0.4513423 -0.2210268 low
0.4131797 -0.8012385 -0.2723291 -0.9984241 0.3221749 -1.6865656 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4406159 -1.0324884 0.2682577 low
0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.4079525 -1.6759080 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4406159 -0.5913775 0.0942899 low
2.7285450 -1.0402932 -0.2723291 -1.2486880 -0.5645100 -0.7451421 1.6738568 -0.7521778 0.3605309 1.2214933 0.4406159 0.3006467 -0.3949946 low
3.3717021 -1.4455202 -0.2723291 -1.3090965 1.3725336 -1.6581453 2.3277455 -0.5224844 -1.0812880 -0.2566040 0.4299910 -1.0983050 1.3990484 low
3.1573164 -1.5154874 -0.2723291 -1.2486880 0.1399989 -1.1678945 2.5609210 -0.8670245 -0.5650812 -0.5337472 0.4406159 -0.9638712 0.2356387 low
3.8004735 -1.4309437 -0.2723291 -1.2400582 0.7562662 -0.9973725 2.1511780 -0.5224844 -0.9032856 -1.5499390 0.3968018 -1.2187352 0.9858749 low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.1987347 -1.3988097 1.9089794 -0.1779443 -0.7371501 0.5748257 0.3724850 -0.8112328 0.0834169 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.5090032 -0.7593522 1.4897384 -0.1779443 -0.7371501 0.5748257 0.4406159 -0.4807497 -0.3188837 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.7737278 -0.0843694 1.6290739 -0.1779443 -0.7371501 0.5748257 0.4210091 0.0695886 -0.4167406 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.4534965 0.8819220 1.4358374 -0.1779443 -0.7371501 0.5748257 0.2344707 0.2502341 -0.7103112 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 0.2438961 -0.0275287 1.6291213 -0.1779443 -0.7371501 0.5748257 0.4406159 -0.8294374 -0.0361860 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 0.6794107 -0.8943488 1.9878602 -0.1779443 -0.7371501 0.5748257 0.4261573 -0.4415399 0.2682577 med_low
0.2631097 -1.4221978 -0.2723291 -1.1960462 1.1661623 -0.3223896 2.5776850 -0.7521778 -1.1406221 0.0667298 0.4005260 -0.6445909 1.1380967 low
2.9429307 -1.1321252 -0.2723291 -1.3522454 0.0076366 -1.8037995 1.3375333 -0.6373311 -0.4226793 -1.0880337 0.4406159 -1.1179099 0.1051628 low
2.9429307 -1.1321252 -0.2723291 -1.3522454 -0.7082582 -1.3313114 1.3375333 -0.6373311 -0.4226793 -1.0880337 0.4406159 -0.3379138 -0.3406296 low
0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.5787425 -1.6759080 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4330580 -0.6375891 -0.0579320 low
0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.9829455 -1.1288166 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4406159 0.0611865 -0.5580894 med_low
0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.5687797 -1.2638131 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4406159 -0.5409648 -0.1775348 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 0.1883894 -2.2016841 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2963581 -0.8308377 0.1812738 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 -0.4606127 -1.8144571 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2219837 -0.3883265 -0.0905509 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 -0.3125947 -2.1590536 0.7086718 -0.6373311 -0.6125485 0.3438730 0.3750043 -0.9988800 0.0290519 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 -0.0564097 -2.2158943 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2245030 -0.7160089 0.0942899 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.0165586 -2.2229994 0.2167712 -0.5224844 -0.0607412 0.1129203 0.4189279 -0.8224356 0.1704008 low
-0.4872402 0.2468126 -0.2723291 -1.0156836 0.0019436 -0.8375082 0.3360184 -0.5224844 -0.0607412 0.1129203 0.2908813 -0.5199596 -0.1231699 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.0080191 0.2104916 0.1221238 -0.5224844 -0.0607412 0.1129203 0.1860561 -0.0956529 -0.2753917 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.2058509 -0.8090878 0.1403124 -0.5224844 -0.0607412 0.1129203 0.3317379 -0.3337127 -0.1884078 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.0749119 -0.5284370 0.5789293 -0.5224844 -0.0607412 0.1129203 0.3256039 -0.0438399 -0.1449159 low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.5844355 -1.1359217 0.3360184 -0.5224844 -0.0607412 0.1129203 0.4314149 -0.4975539 -0.2427728 med_low
0.5846882 -0.9149352 -0.2723291 -1.1106113 0.6295970 -1.2460504 0.7625253 -0.6373311 -0.7549503 0.2514920 0.4406159 -1.0310881 0.5944473 low
0.5846882 -0.9149352 -0.2723291 -1.1106113 0.4758859 0.0648374 0.7625253 -0.6373311 -0.7549503 0.2514920 0.4267049 -0.7608201 0.1486548 low
0.5846882 -0.9149352 -0.2723291 -1.1106113 0.0247156 -1.2922335 0.7625253 -0.6373311 -0.7549503 0.2514920 0.4406159 -0.8308377 0.2465117 low
0.5846882 -0.9149352 -0.2723291 -1.1106113 -0.1674232 -0.7771150 0.7625253 -0.6373311 -0.7549503 0.2514920 0.3720469 -0.7202099 0.0399249 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 0.1485384 -0.7309319 0.4674705 -0.7521778 -0.9566863 0.0205393 0.4406159 -0.4247356 0.1486548 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 0.4915417 -0.4431760 0.3051974 -0.7521778 -0.9566863 0.0205393 0.3902297 -0.8574444 0.4422255 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 -0.3837572 -0.8339556 0.3002110 -0.7521778 -0.9566863 0.0205393 0.4306482 0.0289784 -0.0035670 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 -0.2328927 -0.4183083 -0.0225305 -0.7521778 -0.9566863 0.0205393 0.4214472 -0.5899772 -0.0361860 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 1.0281070 0.6296915 -0.1773001 -0.8670245 -0.8202179 -0.3027945 0.4406159 -1.0016807 0.1160358 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 1.1305810 -0.1944981 -0.1807194 -0.8670245 -0.8202179 -0.3027945 0.4314149 -0.9736736 0.6705582 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 0.1883894 -0.0879219 -0.3337319 -0.8670245 -0.8202179 -0.3027945 0.3889153 -0.5381641 0.0073060 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 0.1713104 0.1891763 -0.3338269 -0.8670245 -0.8202179 -0.3027945 0.4039216 -0.6235856 -0.0579320 low
0.7133196 0.5689534 -0.2723291 -0.7826793 0.2239706 -0.5319896 -0.0613298 -0.6373311 -0.8202179 -0.1180323 0.4199137 -0.6291870 0.0399249 low
0.7133196 0.5689534 -0.2723291 -0.7826793 -0.1048002 -1.4094674 -0.0613298 -0.6373311 -0.8202179 -0.1180323 0.4343724 -0.9022557 0.2682577 low
0.7133196 0.5689534 -0.2723291 -0.7826793 -0.0507166 0.3099628 -0.0855021 -0.6373311 -0.8202179 -0.1180323 0.4406159 -0.2889015 -0.2101538 low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 0.4844254 -0.3827828 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.0143049 -0.8406402 0.6379392 med_low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 -0.1731162 0.0364171 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.3850816 -0.1838751 -0.1231699 med_low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 2.5395987 0.2637797 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4406159 -1.1823261 1.7578570 med_low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 2.1852094 -1.1252640 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4037025 -1.2719486 2.3123794 low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 1.6102164 -0.2158134 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4406159 -0.9050564 1.1598427 low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.6295970 0.4023288 -0.4830877 -0.5224844 -0.1438090 1.1291122 0.4171754 -0.4527427 0.5400824 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.7064525 0.0968103 -0.4459031 -0.5224844 -0.1438090 1.1291122 0.4261573 -0.6978043 0.4313525 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.1713104 0.5977186 -0.5130539 -0.5224844 -0.1438090 1.1291122 -3.1313265 -0.2833001 -0.4276135 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.2101207 0.6687695 -0.5130539 -0.5224844 -0.1438090 1.1291122 0.4139988 0.1101988 -0.3515026 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.1674232 0.7611355 -0.6525317 -0.5224844 -0.1438090 1.1291122 0.3945016 -0.0452402 -0.2645187 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.6171702 0.9991558 -0.8016976 -0.5224844 -0.1438090 1.1291122 0.4093984 0.5345055 -0.3297567 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.6385190 0.8286338 -0.7522606 -0.5224844 -0.1438090 1.1291122 0.4271431 0.8411826 -0.3297567 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.2243532 0.5906135 -0.7943366 -0.5224844 -0.1438090 1.1291122 0.3397340 0.2012217 -0.2318998 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.2695146 1.0133660 -0.6468804 -0.5224844 -0.1438090 1.1291122 0.4224331 -0.0536423 -0.2971377 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.0791817 0.8037660 -0.5935968 -0.5224844 -0.1438090 1.1291122 0.3785094 0.4056731 -0.3406296 med_high
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.1275722 -0.5035693 -0.4830877 -0.5224844 -0.1438090 1.1291122 0.4032644 0.0485834 -0.0905509 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 0.6125180 0.4627220 -0.5307201 -0.4076377 0.1409947 -0.3027945 0.4262668 -0.3491166 0.0290519 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5289287 0.8641592 -0.6846349 -0.4076377 0.1409947 -0.3027945 0.4192565 0.4980964 -0.4058676 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.2741669 0.9529728 -0.5922195 -0.4076377 0.1409947 -0.3027945 0.4406159 0.6213273 -0.4167406 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.0436004 0.5550881 -0.7306527 -0.4076377 0.1409947 -0.3027945 0.3512352 -0.3085064 -0.4384865 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5075800 0.6971898 -0.6325385 -0.4076377 0.1409947 -0.3027945 -0.1288575 0.4350805 -0.4602325 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.1546139 0.1394408 -0.5057404 -0.4076377 0.1409947 -0.3027945 0.4011832 -0.0858504 -0.1449159 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.3752177 0.4982475 -0.4975246 -0.4076377 0.1409947 -0.3027945 0.4144370 -0.3295117 -0.3623756 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5872820 0.1607560 -0.6256999 -0.4076377 0.1409947 -0.3027945 -0.1976456 0.3804668 -0.2318998 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.7879603 -0.1198948 -0.4919208 -0.4076377 0.1409947 -0.3027945 0.3814669 0.1340048 -0.3515026 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.5901285 0.0399696 -0.7300828 -0.8670245 -1.3067576 0.2976825 0.3557261 0.2404316 -0.0579320 low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.3994130 0.5515356 -0.7587192 -0.8670245 -1.3067576 0.2976825 0.2299797 0.2264281 -0.2427728 low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.4606127 0.8641592 -0.8111956 -0.8670245 -1.3067576 0.2976825 0.2345802 0.7389569 -0.2210268 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.6100540 1.0098134 -0.8788687 -0.8670245 -1.3067576 0.2976825 0.1493618 1.7864202 -0.5689624 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.5773192 0.9671829 -0.8494724 -0.8670245 -1.3067576 0.2976825 0.2487102 0.6899446 -0.4058676 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.4250315 0.7042949 -0.8558361 -0.8670245 -1.3067576 0.2976825 0.3104881 0.3020471 -0.1231699 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.9559038 0.9600779 -0.9677698 -0.8670245 -1.3067576 0.2976825 0.0286541 2.0454854 -0.7429302 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.8420438 0.9742880 -0.9530004 -0.6373311 0.1706618 1.2676838 0.3881485 0.6353309 -0.6885653 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.2083149 1.0737592 -0.9415079 -0.6373311 0.1706618 1.2676838 0.4406159 0.3832675 -0.4928515 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.9217458 0.9281050 -0.8620098 -0.6373311 0.1706618 1.2676838 0.4406159 0.7963713 -0.8951520 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.2467426 1.0773117 -0.7961887 -0.6373311 0.1706618 1.2676838 0.4202424 -0.0074307 -0.3623756 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.0588736 1.0346812 -0.7237666 -0.6373311 0.1706618 1.2676838 0.4406159 -0.0550427 -0.3188837 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.1243431 1.0417863 -0.6969823 -0.6373311 0.1706618 1.2676838 0.3185937 -0.2146828 0.0507979 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.6584445 0.9529728 -0.6293092 -0.6373311 0.1706618 1.2676838 0.3506875 0.3328548 -0.4493595 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.7509558 1.0595490 -0.6881492 -0.6373311 0.1706618 1.2676838 -1.0286891 0.6521351 -0.7538032 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.0716829 1.0524439 -0.7998930 -0.6373311 0.1706618 1.2676838 0.4161895 0.6031228 -0.4819785 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.4876545 0.8854745 -0.8681835 -0.6373311 0.1706618 1.2676838 0.2363328 0.5947207 -0.5580894 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.2410496 1.0595490 -0.9237941 -0.6373311 0.1706618 1.2676838 0.4097270 0.2712393 -0.5907084 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.6086307 1.0524439 -1.0098459 -0.6373311 0.1706618 1.2676838 0.3873818 1.2136763 -1.0038819 med_low
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.1901952 1.0417863 -1.0097984 -0.6373311 0.1706618 1.2676838 0.4406159 0.8131756 -0.5145975 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.1574604 0.8890270 -1.0367727 -0.6373311 0.1706618 1.2676838 0.3440059 1.6113762 -0.9277710 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -1.8013144 1.1163897 -1.1186928 -0.6373311 0.1706618 1.2676838 0.4406159 3.0467371 -0.8842790 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -1.2547863 1.1163897 -1.1746359 -0.5224844 -0.0310742 -1.7347012 0.4406159 1.9838699 -0.9930089 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.1622751 1.1163897 -1.1318000 -0.5224844 -0.0310742 -1.7347012 0.4406159 1.9278558 -0.7538032 high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.9664114 1.0382338 -1.1630958 -0.5224844 -0.0310742 -1.7347012 0.4406159 2.3297568 -1.1669767 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.2200834 1.1163897 -1.1283332 -0.5224844 -0.0310742 -1.7347012 -2.0128627 2.1211044 -0.9495170 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.9345550 1.1163897 -1.0820306 -0.5224844 -0.0310742 -1.7347012 -2.0527336 0.5597119 -0.7538032 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.9336767 0.9636304 -1.1085299 -0.5224844 -0.0310742 -1.7347012 0.3837671 2.3633653 -0.8625331 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.5636316 0.8961321 -1.0758569 -0.5224844 -0.0310742 -1.7347012 0.0034610 2.1939227 -0.5145975 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.9786758 0.9352101 -1.0777090 -0.5224844 -0.0310742 -1.7347012 -0.0528401 1.2318808 -0.7755492 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.2314694 1.0204711 -1.0338758 -0.5224844 -0.0310742 -1.7347012 0.1766361 0.2026221 -0.1122969 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.2533631 1.1163897 -1.0464131 -0.5224844 -0.0310742 -1.7347012 -0.1651137 0.0877932 -0.3188837 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -1.8112772 0.6900847 -1.0375800 -0.5224844 -0.0310742 -1.7347012 -0.1467118 -0.0746476 -0.7864221 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.8192718 1.0631016 -1.0314063 -0.5224844 -0.0310742 -1.7347012 -1.0375614 0.4392816 -0.3406296 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -0.2215067 0.9742880 -0.9714740 -0.5224844 -0.0310742 -1.7347012 -0.3905371 0.3454580 -0.6015814 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -0.1887719 0.4982475 -0.9733261 -0.5224844 -0.0310742 -1.7347012 -2.9428165 0.3314545 -0.7538032 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.4412321 0.9032372 -0.9776477 -0.5224844 -0.0310742 -1.7347012 -2.9360253 0.4882939 -1.0256279 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 0.9370190 1.0240236 -0.9107344 -0.5224844 -0.0310742 -1.7347012 0.0740016 -1.1291127 2.0405547 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.3111714 1.1163897 -0.9677223 -0.5224844 -0.0310742 -1.7347012 -0.0304949 -0.8714479 0.1921468 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 0.3207517 1.1163897 -0.9636382 -0.5224844 -0.0310742 -1.7347012 0.0836407 -0.7370141 0.0834169 med_high
-0.4872402 1.2307270 3.6647712 0.4341211 -0.0492934 0.8535016 -0.9482040 -0.5224844 -0.0310742 -1.7347012 -0.1944691 -1.0016807 0.4857174 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 1.7141136 0.7895559 -0.8662839 -0.5224844 -0.0310742 -1.7347012 0.1944903 -1.5296134 2.9865046 med_high
-0.4872402 1.2307270 3.6647712 0.4341211 2.1595909 1.0524439 -0.8331359 -0.5224844 -0.0310742 -1.7347012 0.3607647 -1.5030067 2.9865046 med_high
-0.4872402 1.2307270 3.6647712 0.4341211 2.9751133 0.8996847 -0.7755306 -0.5224844 -0.0310742 -1.7347012 0.3480587 -1.3069574 2.9865046 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.6129005 0.8250813 -0.6520568 -0.5224844 -0.0310742 -1.7347012 0.4210091 -0.1418645 0.0181789 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.2613577 0.8677118 -0.7178779 -0.5224844 -0.0310742 -1.7347012 -1.2762386 -0.3981289 0.2682577 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 2.3403437 0.9813931 -0.8306664 -0.5224844 -0.0310742 -1.7347012 0.1382988 -1.2537440 2.9865046 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.5801657 0.3774611 -0.6502047 -0.5224844 -0.0310742 -1.7347012 -1.4137053 -0.0718469 0.1377818 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 0.0489109 0.9778406 -0.8049744 -0.5224844 -0.0310742 -1.7347012 -0.6526548 -0.2174835 0.1377818 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 0.1670406 0.9458677 -0.7278033 -0.5224844 -0.0310742 -1.7347012 -0.2917364 -0.1866758 -0.0253130 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.5830122 0.9245525 -0.6502047 -0.5224844 -0.0310742 -1.7347012 -0.7052317 0.2488337 -0.5580894 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.5758960 1.0204711 -0.6678710 -0.5224844 -0.0310742 -1.7347012 -0.0935872 -0.0872508 -0.3732486 med_high
-0.4872402 -1.0330050 -0.2723291 -0.3857090 -1.0142570 0.7078474 -0.5693769 -0.5224844 -0.6659492 -0.8570810 0.4406159 0.2852429 0.0616709 med_low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.1869661 0.5515356 -0.5455370 -0.5224844 -0.6659492 -0.8570810 0.4252810 -0.5059560 0.1160358 med_low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 -0.6057842 0.0044442 -0.5191326 -0.5224844 -0.6659492 -0.8570810 0.4004165 -0.4219349 0.0073060 med_low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.3719887 -1.2602606 -0.3147360 -0.5224844 -0.6659492 -0.8570810 0.3755520 -1.0254866 0.7466691 low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 -0.3766409 -0.7593522 -0.1140436 -0.5224844 -0.6659492 -0.8570810 0.4004165 -0.3561184 0.0725439 low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.0432179 0.1714136 -0.2267846 -0.5224844 -0.6659492 -0.8570810 0.4263763 -0.8910529 0.2247657 low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.8188892 0.2069391 -0.4177891 -0.5224844 -0.6659492 -0.8570810 0.3789476 -0.8028307 0.8010341 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 0.9896793 -0.3614676 -0.4587729 -0.7521778 -1.2770905 -0.3027945 0.4406159 -1.0660969 1.5947622 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 2.1069307 0.5231153 -0.5005640 -0.7521778 -1.2770905 -0.3027945 0.4259382 -0.7132081 1.8774599 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.2001579 -0.2264710 -0.5685221 -0.7521778 -1.2770905 -0.3027945 0.4406159 -0.4485416 1.4860323 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 1.2387480 0.8392915 -0.5197499 -0.7521778 -1.2770905 -0.3027945 0.4101651 -1.0969046 1.6708731 med_low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 0.3961839 0.9600779 -0.4502247 -0.7521778 -1.2770905 -0.3027945 0.4406159 -0.9764743 1.0837317 med_low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.9687130 0.7540305 -0.3833114 -0.7521778 -1.2770905 -0.3027945 0.3759901 0.1858179 0.4204795 med_low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.1873487 0.0079967 -0.2447358 -0.7521778 -1.2770905 -0.3027945 0.3333809 0.0695886 0.7684151 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 2.2008652 -0.5319896 -0.2829652 -0.7521778 -1.2770905 -0.3027945 0.3938444 -1.1487176 2.9865046 low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.7078757 -0.9760573 -0.0030596 -0.5224844 -0.0607412 -1.5037485 0.4074267 -0.8364391 1.0293668 low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.3862212 -1.4023623 0.3664594 -0.5224844 -0.0607412 -1.5037485 0.2866094 -1.1333138 0.7901611 med_low
1.4422310 -1.1219217 -0.2723291 -1.0156836 1.2814456 -1.0542132 0.3664594 -0.5224844 -0.0607412 -1.5037485 0.4406159 -1.0170845 1.3446835 med_low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.9484050 -1.6723554 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.2300893 -1.0576947 1.5730162 med_low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.6466760 -1.3419691 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.3618601 -1.1151092 0.8662720 low
1.4422310 -1.1219217 -0.2723291 -1.0156836 1.2714828 -1.5018334 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.3704038 -1.3699732 1.5077783 med_low
2.0853880 -1.1962619 -0.2723291 -1.3263561 0.7334942 -2.0844502 1.1514203 -0.9818712 -0.8498849 -1.3189864 0.4019500 -1.0674972 0.9315099 low
2.0853880 -1.1962619 -0.2723291 -1.3263561 0.4545372 -1.7682741 1.1514203 -0.9818712 -0.8498849 -1.3189864 0.2193548 -1.1585201 0.7140502 low
2.9429307 -1.5563017 -0.2723291 -1.1451305 2.2634882 -1.2993386 0.8801579 -0.6373311 -0.9092190 -1.8732728 0.4113700 -1.3559697 2.9865046 low
2.9429307 -1.4017907 -0.2723291 -1.3004667 1.4266171 -1.2247352 1.6687754 -0.8670245 -0.4701466 -2.7047025 0.4406159 -1.2005307 1.1707156 low
2.9429307 -1.4017907 -0.2723291 -1.3004667 1.1704320 -1.1359217 1.6687754 -0.8670245 -0.4701466 -2.7047025 -0.0258945 -0.5661712 0.8445260 low
2.9429307 -1.4017907 -0.2723291 -1.3004667 1.4081148 -1.0755284 1.6687754 -0.8670245 -0.4701466 -2.7047025 0.3891344 -0.8448412 1.3120645 low
3.5860878 -1.4090789 -0.2723291 -1.3090965 0.9825630 -1.8926130 1.8323307 -0.7521778 -0.0370076 -0.6723188 0.4406159 -1.1333138 1.3446835 low
3.5860878 -1.4090789 -0.2723291 -1.3090965 1.2102830 -1.9423486 1.8323307 -0.7521778 -0.0370076 -0.6723188 0.3026016 -1.1487176 1.1272237 low
3.0501236 -1.3274505 -0.2723291 -1.2055390 -0.1745394 -1.0719759 1.1753552 -0.8670245 -0.3574118 -1.7347012 0.4063314 -0.7314127 0.1704008 low
3.0501236 -1.3274505 -0.2723291 -1.2055390 1.8863269 -1.8784028 1.1753552 -0.8670245 -0.3574118 -1.7347012 0.4239665 -1.3363648 2.1492845 low
3.5860878 -1.2327031 -0.2723291 -1.1960462 2.2321767 -1.2567081 0.6282713 -0.6373311 -1.0931548 -1.7347012 0.3954874 -1.2383402 2.8234098 low
3.5860878 -1.2327031 -0.2723291 -1.1960462 2.4897850 -1.3028911 0.6282713 -0.6373311 -1.0931548 -1.7347012 0.3710611 -1.3685729 2.9865046 low
-0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.5602402 -1.6439351 0.0714046 -0.6373311 -0.7786840 0.0667298 0.4406159 -0.2496916 0.0073060 med_low
-0.4872402 -0.0797012 -0.2723291 -0.5669346 0.0588736 -0.5710675 0.2658758 -0.6373311 -0.7786840 0.0667298 0.4183803 -0.2356881 0.2030197 med_low
-0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.7139512 0.1465458 0.2658758 -0.6373311 -0.7786840 0.0667298 0.3587931 0.7571615 -0.0035670 med_low
[Full data frame printout omitted: the chunk printed every observation of the standardized dataset (13 scaled numeric variables plus a categorical crime-rate class: low, med_low, med_high, high). Only the first rows are informative, so prefer `head()` or `summary()` over printing the whole object.]
-0.4872402 1.0149946 -0.2723291 1.3661384 0.1300361 0.7042949 -0.5831490 1.6596029 1.5294129 0.8057784 0.3807001 0.2796414 -0.5254704 high
-0.4872402 1.0149946 -0.2723291 1.3661384 0.0460644 0.5124576 -0.5036983 1.6596029 1.5294129 0.8057784 0.4406159 0.1872182 -0.3297567 high
-0.4872402 1.0149946 -0.2723291 1.3661384 0.3250214 0.7575830 -0.4717851 1.6596029 1.5294129 0.8057784 0.4068791 -0.3309120 -0.2536457 high
-0.4872402 1.0149946 -0.2723291 0.8656106 -0.1076467 -0.1127897 -0.3949464 1.6596029 1.5294129 0.8057784 0.4406159 0.0793911 -0.1231699 high
-0.4872402 1.0149946 -0.2723291 0.8656106 -0.7481093 -0.7238268 -0.3459843 1.6596029 1.5294129 0.8057784 -0.2439790 0.2068231 -0.2862647 med_high
-0.4872402 1.0149946 -0.2723291 0.8656106 -0.4734220 0.5728508 -0.4385897 1.6596029 1.5294129 0.8057784 -3.6657487 0.6297295 -0.3841216 high
-0.4872402 1.0149946 -0.2723291 0.2528955 -0.4008362 0.9209999 -0.5958763 1.6596029 1.5294129 0.8057784 -0.2780445 1.2136763 -0.3732486 high
-0.4872402 1.0149946 -0.2723291 0.2183763 -0.5104265 0.0861526 -0.4210659 1.6596029 1.5294129 0.8057784 0.1321648 0.7669640 -0.3732486 high
-0.4872402 1.0149946 -0.2723291 0.2183763 -0.8135788 -0.4218608 -0.4612898 1.6596029 1.5294129 0.8057784 0.4406159 0.2950453 -0.2645187 high
-0.4872402 1.0149946 -0.2723291 0.2183763 -0.1674232 0.5479830 -0.3617035 1.6596029 1.5294129 0.8057784 0.4406159 0.5092992 -0.2862647 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 -0.0791817 0.7860033 -0.3304076 1.6596029 1.5294129 0.8057784 0.4234189 0.0303788 -0.3188837 high
-0.4872402 1.0149946 -0.2723291 0.2183763 0.2168544 0.2282543 -0.4267172 1.6596029 1.5294129 0.8057784 0.4019500 0.2390312 0.0725439 med_high
-0.4872402 1.0149946 -0.2723291 0.5117892 0.9896793 -0.0346338 -0.5993905 1.6596029 1.5294129 0.8057784 0.1972287 -0.1390638 0.7901611 high
-0.4872402 1.0149946 -0.2723291 0.2528955 -1.2206283 0.9529728 -0.6483526 1.6596029 1.5294129 0.8057784 -0.0448441 0.7683643 -0.9495170 high
-0.4872402 1.0149946 -0.2723291 0.2528955 -0.1745394 1.0240236 -0.7546351 1.6596029 1.5294129 0.8057784 -0.5905484 1.6029741 -1.0038819 high
-0.4872402 1.0149946 -0.2723291 0.5117892 0.2837472 0.8890270 -0.7074776 1.6596029 1.5294129 0.8057784 0.4330580 0.8439833 -0.6342003 high
-0.4872402 1.0149946 -0.2723291 0.5117892 -1.3956881 1.0204711 -0.8046419 1.6596029 1.5294129 0.8057784 -0.0788000 1.7164026 -1.1452307 high
-0.4872402 1.0149946 -0.2723291 0.5117892 -0.1418047 0.9991558 -0.7714940 1.6596029 1.5294129 0.8057784 0.2522154 0.7529604 -0.8625331 high
-0.4872402 1.0149946 -0.2723291 0.5117892 -0.0791817 0.6900847 -0.8756394 1.6596029 1.5294129 0.8057784 0.2918671 0.0639872 -0.1231699 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 -0.0606794 -0.1376575 -0.1761129 1.6596029 1.5294129 0.8057784 0.4406159 -0.2678962 0.0507979 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 0.6623317 0.2247018 -0.2200411 1.6596029 1.5294129 0.8057784 0.3986639 -0.6880018 0.1269088 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 1.1049625 0.2993051 -0.1825715 1.6596029 1.5294129 0.8057784 0.4228712 -0.7902275 0.2682577 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 -0.7438395 -1.0044776 0.1440166 1.6596029 1.5294129 0.8057784 0.3970209 -0.3127075 -0.0796779 med_high
-0.4872402 1.0149946 -0.2723291 0.2442657 -0.5887052 -0.9476370 -0.0337381 1.6596029 1.5294129 0.8057784 0.1539623 0.0961953 -0.2101538 med_high
-0.4872402 1.0149946 -0.2723291 0.2442657 0.0389481 -0.5923828 0.0933924 1.6596029 1.5294129 0.8057784 0.3499208 -0.2903018 -0.1449159 med_high
-0.4872402 1.0149946 -0.2723291 0.2442657 -0.2428554 0.3987763 -0.1183177 1.6596029 1.5294129 0.8057784 0.3943920 0.3258531 -0.3732486 high
-0.4872402 1.0149946 -0.2723291 0.2442657 -0.5403147 -0.5461998 -0.3052380 1.6596029 1.5294129 0.8057784 0.3455394 -0.1684712 -0.2101538 high
-0.4872402 2.4201701 -0.2723291 0.4686402 -1.1822006 0.8570542 -0.9375187 -0.6373311 1.7964164 0.7595879 0.4207900 0.7571615 -0.7972951 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -1.2391306 1.0559965 -0.9686246 -0.6373311 1.7964164 0.7595879 -0.1382776 1.5847695 -1.6888801 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -1.6959939 1.0453389 -0.9367114 -0.6373311 1.7964164 0.7595879 -0.4189067 2.3843705 -1.5692773 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 1.0737592 -0.9151035 -0.6373311 1.7964164 0.7595879 0.3662415 0.7585618 -0.9712629 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 0.5302203 -0.8002729 -0.6373311 1.7964164 0.7595879 0.4406159 0.0975957 -0.2645187 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.8221183 -0.5177794 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 -0.0900515 -0.0796779 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.5104265 -0.9227692 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 0.1312041 0.2138927 med_high
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.8747785 -1.4130199 -0.4732098 -0.4076377 -0.1022751 0.3438730 0.4010737 0.6927453 0.0616709 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -1.2732886 0.1536509 -0.4732098 -0.4076377 -0.1022751 0.3438730 0.4406159 1.1884699 -0.3080107 med_high
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.6982955 0.0719425 -0.4285218 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2026221 -0.4602325 med_high
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3780642 -0.1163422 -0.6581830 -0.4076377 -0.1022751 0.3438730 0.4406159 0.0373805 -0.1449159 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -1.0185268 0.1749662 -0.6625521 -0.4076377 -0.1022751 0.3438730 0.4282384 0.3426573 -0.5472164 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3666782 0.3952238 -0.6158695 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2348302 -0.6233273 med_low
-0.4872402 0.1156240 -0.2723291 0.1579678 0.4388814 0.0186544 -0.6251775 -0.9818712 -0.8024176 1.1753027 0.3868341 -0.4177339 -0.0144400 low
-0.4872402 0.1156240 -0.2723291 0.1579678 -0.2343159 0.2886475 -0.7159308 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.5003546 -0.2101538 low
-0.4872402 0.1156240 -0.2723291 0.1579678 0.9839863 0.7966610 -0.7729187 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.9820757 0.1486548 low
-0.4872402 0.1156240 -0.2723291 0.1579678 0.7249547 0.7362677 -0.6677760 -0.9818712 -0.8024176 1.1753027 0.4028263 -0.8644462 -0.0579320 med_low
-0.4872402 0.1156240 -0.2723291 0.1579678 -0.3624084 0.4343017 -0.6126402 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.6683969 -1.1561037 low

We checked the scaled crime rate values and computed the quantiles of the crime data, which were then named low (0-25%), medium low (25-50%), medium high (50-75%) and high (75-100%). This categorical “crime” variable then replaced the continuous crim values in the ‘boston_scaled’ data set.

Quantiles:
The 0% - 25% - 50% - 75% - 100% quantiles are the cut points that divide the data distribution into four equal-sized parts; they tell how the values are spread across the distribution. The median is the 50% quantile, where the lower and upper halves of the data meet.
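The binning described above can be sketched in R with `quantile()` and `cut()`. This is a minimal, self-contained example with toy data; in the actual analysis the input is the scaled crim column of ‘boston_scaled’:

```r
# toy stand-in for the scaled crime rate column (illustrative data only)
set.seed(123)
scaled_crim <- as.numeric(scale(rexp(100)))

# quantile() returns the 0/25/50/75/100 % cut points by default
bins <- quantile(scaled_crim)

# cut the continuous variable into a four-level factor;
# include.lowest = TRUE keeps the minimum value in the "low" class
crime <- cut(scaled_crim, breaks = bins,
             labels = c("low", "med_low", "med_high", "high"),
             include.lowest = TRUE)

table(crime)  # the classes are of (nearly) equal size
```

Because the breaks are the sample quartiles, the four classes end up with roughly equal numbers of observations.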


1.3.3. Define training and test data sets

We are going to use 80% of the observations (rows) to train the linear discriminant analysis model.
Therefore we prepare a training data set ‘boston_train’ that contains 80% of the rows, with the “crime” variable included.
The test data set ‘boston_test’ contains the remaining 20% of the rows, and we remove the “crime” variable from it. Later on we will use the linear discriminant model to predict the crime classes of this test data set.

# number of rows in the Boston dataset 
n_boston <- nrow(boston_scaled)

# choose randomly 80% of the rows
ind_boston <- sample(n_boston,  size = n_boston * 0.8)

# create train set (crime variable stays in the data set)
boston_train <- boston_scaled[ind_boston,]

# create test set 
boston_test <- boston_scaled[-ind_boston,]
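One detail worth noting about the split above: `sample()` draws different rows on every run, so the exact training and test sets change between knits. Fixing a seed first makes the split reproducible (a sketch with an arbitrary seed, not part of the original script):

```r
set.seed(2019)                   # any fixed integer makes the draw reproducible
n <- 506                         # nrow(Boston), i.e. nrow(boston_scaled)
ind <- sample(n, size = n * 0.8) # the non-integer size 404.8 is truncated
length(ind)                      # 404 rows go to training, 102 remain for testing
```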

knitr::kable(boston_test, caption = "Test data set with the crime variable") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px") # display the table in a scroll box
Test data set with the crime variable
zn indus chas nox rm age dis rad tax ptratio black lstat medv crime
7 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249 med_low
15 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 0.5657458 0.3166900 -0.6373311 -0.6006817 1.1753027 0.2557205 -0.3351131 -0.4711055 med_high
23 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2030044 0.8215287 0.0863639 -0.6373311 -0.6006817 1.1753027 0.4406159 0.8495847 -0.7972951 med_high
(… the remaining 99 of the 102 test set rows are omitted for brevity; the full table is scrollable in the rendered page …)
# save the correct classes from test data
correct_classes <- boston_test$crime
correct_classes
##   [1] med_low  med_high med_high med_high med_high med_high med_low 
##   [8] med_low  low      med_low  med_low  low      low      med_low 
##  [15] med_low  med_low  low      low      low      med_low  low     
##  [22] med_low  med_low  med_low  med_low  med_low  med_low  med_low 
##  [29] med_low  med_high med_high med_high med_high med_high med_high
##  [36] med_high low      low      low      low      low      med_low 
##  [43] med_high med_high med_low  med_high low      low      low     
##  [50] med_low  med_low  low      low      med_high med_high med_high
##  [57] med_high med_high med_low  low      low      low      high    
##  [64] high     high     high     high     high     high     high    
##  [71] high     high     high     high     high     high     high    
##  [78] high     high     high     high     high     high     high    
##  [85] high     high     high     high     high     high     high    
##  [92] med_high high     high     med_high high     med_low  med_high
##  [99] med_high med_low  low      low     
## Levels: low med_low med_high high
# remove the crime variable from the test data set, so the prediction can be made later
boston_test <- dplyr::select(boston_test, -crime)
knitr::kable(boston_test, caption = "Test data without the crime variable") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px") # display the table in a scroll box
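The pieces of this section — scaling, binning, splitting, and storing the correct classes — feed into the LDA prediction that follows. As a self-contained sketch of the whole pipeline (using an illustrative fixed seed, so the split differs from the document's actual random draw):

```r
library(MASS)    # provides the Boston data and lda()
library(dplyr)   # for select()

# standardize the Boston data and bin the crime rate into quartile classes
boston_scaled <- as.data.frame(scale(Boston))
crime <- cut(boston_scaled$crim, breaks = quantile(boston_scaled$crim),
             labels = c("low", "med_low", "med_high", "high"),
             include.lowest = TRUE)
boston_scaled <- data.frame(select(boston_scaled, -crim), crime)

# 80/20 split with a fixed seed (illustrative; the original draw was unseeded)
set.seed(2019)
ind <- sample(nrow(boston_scaled), size = nrow(boston_scaled) * 0.8)
boston_train <- boston_scaled[ind, ]
boston_test  <- boston_scaled[-ind, ]

# store the correct classes, then drop them from the test set
correct_classes <- boston_test$crime
boston_test <- select(boston_test, -crime)

# fit LDA on the training set and predict the classes of the test set
lda.fit  <- lda(crime ~ ., data = boston_train)
lda.pred <- predict(lda.fit, newdata = boston_test)
table(correct = correct_classes, predicted = lda.pred$class)
```

The cross-tabulation at the end compares the stored correct classes against the predictions, which is how the model's classification accuracy is assessed.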
Test data without the crime variable
zn indus chas nox rm age dis rad tax ptratio black lstat medv
7 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249
15 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 0.5657458 0.3166900 -0.6373311 -0.6006817 1.1753027 0.2557205 -0.3351131 -0.4711055
23 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2030044 0.8215287 0.0863639 -0.6373311 -0.6006817 1.1753027 0.4406159 0.8495847 -0.7972951
(… the remaining test set rows are omitted for brevity; the full table is scrollable in the rendered page …)
299 2.5141594 -1.2968398 -0.2723291 -1.3349859 0.0859154 -1.7220910 1.9151531 -0.5224844 -0.2980777 -1.6885107 0.1266881 -1.0758993 -0.0035670
301 2.5141594 -1.2968398 -0.2723291 -1.3349859 0.8345450 -0.7522472 1.9151531 -0.5224844 -0.2980777 -1.6885107 0.3744566 -0.9218606 0.2465117
310 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.4449570 0.2886475 -0.3288880 -0.6373311 -0.6184819 -0.0256513 0.4333866 -0.3757233 -0.2427728
313 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.3723712 0.7753457 -0.4563984 -0.6373311 -0.6184819 -0.0256513 0.4340438 -0.1306617 -0.3406296
314 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.0265214 0.5053525 -0.2527616 -0.6373311 -0.6184819 -0.0256513 0.4021690 -0.6655962 -0.1014239
319 -0.4872402 -0.1802792 -0.2723291 -0.0922961 0.1385756 -0.0488439 -0.1246813 -0.6373311 -0.6184819 -0.0256513 0.4221044 -0.3211096 0.0616709
320 -0.4872402 -0.1802792 -0.2723291 -0.0922961 -0.2442787 -0.3472574 0.0982364 -0.6373311 -0.6184819 -0.0256513 0.4332770 0.0107739 -0.1666618
328 -0.4872402 -0.5476072 -0.2723291 -0.5324154 -0.2869762 -0.8836912 0.7697438 -0.5224844 -0.7193499 0.5286352 0.4406159 0.0191760 -0.0361860
338 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.5545472 -0.3188371 0.8642962 -0.5224844 -1.0931548 0.8057784 0.4177230 -0.2931025 -0.4384865
345 1.8710023 -1.0723615 -0.2723291 -0.6100835 0.8388147 -1.4378877 1.2681505 -0.5224844 -0.2268768 -0.3951756 0.3428010 -1.1263120 0.9423829
353 2.0853880 -1.3770106 -0.2723291 -1.2400582 -0.5702030 -1.7789317 3.2840500 -0.6373311 0.0163931 -0.0718418 0.3905583 -0.6810000 -0.4276135
357 -0.4872402 1.0149946 3.6647712 1.8580364 -0.1033769 1.0240236 -0.7944316 1.6596029 1.5294129 0.8057784 0.2306369 0.6927453 -0.5145975
364 -0.4872402 1.0149946 3.6647712 1.8580364 -0.6854862 0.7256101 -0.8977222 1.6596029 1.5294129 0.8057784 -0.0398054 0.2782411 -0.6233273
370 -0.4872402 1.0149946 3.6647712 0.6584956 0.5669739 1.0027084 -1.1579669 1.6596029 1.5294129 0.8057784 0.2043485 -1.2495430 2.9865046
375 -0.4872402 1.0149946 -0.2723291 0.9777978 -3.0551979 1.1163897 -1.2623023 1.6596029 1.5294129 0.8057784 0.4406159 3.5452624 -0.9495170
381 -0.4872402 1.0149946 -0.2723291 1.0036872 0.9726003 0.8286338 -1.1295680 1.6596029 1.5294129 0.8057784 0.4406159 0.6381316 -1.3191985
383 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.0654941 1.1163897 -1.0517320 1.6596029 1.5294129 0.8057784 0.4406159 1.5329565 -1.2213417
387 -0.4872402 1.0149946 -0.2723291 1.2539511 -2.3236472 1.1163897 -1.1054906 1.6596029 1.5294129 0.8057784 0.4406159 2.1883213 -1.3083256
396 -0.4872402 1.0149946 -0.2723291 1.1935426 0.2652449 1.0737592 -0.9827291 1.6596029 1.5294129 0.8057784 0.3867246 0.6255284 -1.0256279
398 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.7651883 1.0773117 -1.0265623 1.6596029 1.5294129 0.8057784 0.3989925 1.0176270 -1.5257853
404 -0.4872402 1.0149946 -0.2723291 1.1935426 -1.3316418 0.9742880 -0.9936043 1.6596029 1.5294129 0.8057784 0.4406159 0.9966217 -1.5475313
406 -0.4872402 1.0149946 -0.2723291 1.1935426 -0.8562763 1.1163897 -1.1253414 1.6596029 1.5294129 0.8057784 0.3099404 1.4461347 -1.9063399
409 -0.4872402 1.0149946 -0.2723291 0.3650828 -0.9502108 1.0417863 -1.1114268 1.6596029 1.5294129 0.8057784 -0.4604205 1.9250551 -0.5798354
413 -0.4872402 1.0149946 -0.2723291 0.3650828 -2.3578052 1.1163897 -1.0643168 1.6596029 1.5294129 0.8057784 -3.5914839 3.0411357 -0.5037245
414 -0.4872402 1.0149946 -0.2723291 0.3650828 -1.6077524 1.1163897 -1.0474579 1.6596029 1.5294129 0.8057784 -1.5959718 1.0400326 -0.6776923
417 -0.4872402 1.0149946 -0.2723291 1.0727255 0.7078757 0.7895559 -0.9381836 1.6596029 1.5294129 0.8057784 -3.6705683 1.8396336 -1.6345152
419 -0.4872402 1.0149946 -0.2723291 1.0727255 -0.4663057 1.1163897 -0.9462094 1.6596029 1.5294129 0.8057784 -3.7266503 1.1156516 -1.4931663
423 -0.4872402 1.0149946 -0.2723291 0.5117892 -0.9060900 0.6758745 -0.8756394 1.6596029 1.5294129 0.8057784 -0.7133373 0.2026221 -0.1884078
425 -0.4872402 1.0149946 -0.2723291 0.2528955 -1.0242198 0.0719425 -0.8223082 1.6596029 1.5294129 0.8057784 -3.8668553 0.6311298 -1.1778497
427 -0.4872402 1.0149946 -0.2723291 0.2528955 -0.6370957 -0.3152846 -0.8536040 1.6596029 1.5294129 0.8057784 -3.6368314 0.4252781 -1.3409445
445 -0.4872402 1.0149946 -0.2723291 1.5991427 -0.6129005 0.9956033 -0.9020438 1.6596029 1.5294129 0.8057784 -1.2722954 1.5595632 -1.2757066
446 -0.4872402 1.0149946 -0.2723291 1.5991427 0.2481659 0.9316575 -0.8582106 1.6596029 1.5294129 0.8057784 -3.4351771 1.5861699 -1.1669767
449 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.1418047 1.0702067 -0.7282307 1.6596029 1.5294129 0.8057784 0.4406159 0.7669640 -0.9168980
453 -0.4872402 1.0149946 -0.2723291 1.3661384 0.0175994 0.8250813 -0.6776064 1.6596029 1.5294129 0.8057784 0.3112548 0.6465337 -0.6994382
456 -0.4872402 1.0149946 -0.2723291 1.3661384 0.3421004 0.6367966 -0.6455032 1.6596029 1.5294129 0.8057784 -3.3490825 0.7669640 -0.9168980
457 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.4392640 0.6865322 -0.5767378 1.6596029 1.5294129 0.8057784 -3.7920428 0.8901949 -1.0691198
460 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.2898227 0.5621932 -0.5117241 1.6596029 1.5294129 0.8057784 0.4406159 0.2866432 -0.2753917
463 -0.4872402 1.0149946 -0.2723291 1.3661384 0.0460644 0.5124576 -0.5036983 1.6596029 1.5294129 0.8057784 0.4406159 0.1872182 -0.3297567
469 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.5104265 0.0861526 -0.4210659 1.6596029 1.5294129 0.8057784 0.1321648 0.7669640 -0.3732486
471 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.1674232 0.5479830 -0.3617035 1.6596029 1.5294129 0.8057784 0.4406159 0.5092992 -0.2862647
473 -0.4872402 1.0149946 -0.2723291 0.2183763 0.2168544 0.2282543 -0.4267172 1.6596029 1.5294129 0.8057784 0.4019500 0.2390312 0.0725439
477 -0.4872402 1.0149946 -0.2723291 0.5117892 0.2837472 0.8890270 -0.7074776 1.6596029 1.5294129 0.8057784 0.4330580 0.8439833 -0.6342003
483 -0.4872402 1.0149946 -0.2723291 -0.1958536 1.1049625 0.2993051 -0.1825715 1.6596029 1.5294129 0.8057784 0.4228712 -0.7902275 0.2682577
485 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.5887052 -0.9476370 -0.0337381 1.6596029 1.5294129 0.8057784 0.1539623 0.0961953 -0.2101538
487 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.2428554 0.3987763 -0.1183177 1.6596029 1.5294129 0.8057784 0.3943920 0.3258531 -0.3732486
493 -0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 0.5302203 -0.8002729 -0.6373311 1.7964164 0.7595879 0.4406159 0.0975957 -0.2645187
495 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.5104265 -0.9227692 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 0.1312041 0.2138927
498 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.6982955 0.0719425 -0.4285218 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2026221 -0.4602325
499 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3780642 -0.1163422 -0.6581830 -0.4076377 -0.1022751 0.3438730 0.4406159 0.0373805 -0.1449159
502 -0.4872402 0.1156240 -0.2723291 0.1579678 0.4388814 0.0186544 -0.6251775 -0.9818712 -0.8024176 1.1753027 0.3868341 -0.4177339 -0.0144400
504 -0.4872402 0.1156240 -0.2723291 0.1579678 0.9839863 0.7966610 -0.7729187 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.9820757 0.1486548

1.3.4. Linear discriminant analysis of the training data set

Here we now run the LDA analysis on the ‘boston_train’ data set, i.e. we classify the observations of our training data by the “crime” variable.

# linear discriminant analysis
lda.fit <- lda(crime ~ ., data = boston_train)

# print the lda.fit object
lda.fit
## Call:
## lda(crime ~ ., data = boston_train)
## 
## Prior probabilities of groups:
##       low   med_low  med_high      high 
## 0.2599010 0.2524752 0.2524752 0.2351485 
## 
## Group means:
##                   zn      indus        chas        nox         rm
## low       0.93076797 -0.8879201 -0.12234430 -0.8684841  0.4119481
## med_low  -0.04039317 -0.3617853  0.03646311 -0.5814952 -0.1099490
## med_high -0.36785679  0.1767310  0.19085920  0.4021401  0.1491244
## high     -0.48724019  1.0172655 -0.10655643  1.0921653 -0.3812253
##                 age        dis        rad        tax     ptratio
## low      -0.8678232  0.8704274 -0.6909262 -0.7547243 -0.41189215
## med_low  -0.3399434  0.3916756 -0.5483812 -0.4945201 -0.08497441
## med_high  0.3862031 -0.3751869 -0.4245269 -0.3304206 -0.33449389
## high      0.8208182 -0.8628321  1.6366336  1.5129868  0.77903654
##                black      lstat        medv
## low       0.37966739 -0.7580994  0.49110214
## med_low   0.32242206 -0.1443083  0.01881854
## med_high  0.07461266 -0.0253058  0.22348654
## high     -0.77682965  0.8703099 -0.66739154
## 
## Coefficients of linear discriminants:
##                 LD1         LD2         LD3
## zn       0.10727113  0.59728301 -0.96971212
## indus    0.01735333 -0.25962098  0.12200687
## chas    -0.08555422 -0.07302459  0.14139314
## nox      0.36315828 -0.89265082 -1.28130821
## rm      -0.10124154 -0.13426602 -0.19512917
## age      0.22743047 -0.25843723 -0.02311861
## dis     -0.04850745 -0.23346089  0.17444445
## rad      3.34125298  0.92219054 -0.18353544
## tax      0.06797628  0.09844992  0.65702430
## ptratio  0.13444899 -0.05118374 -0.26545131
## black   -0.12585456  0.06343315  0.13338021
## lstat    0.27601335 -0.21846852  0.50967572
## medv     0.24525181 -0.46806880 -0.04654628
## 
## Proportion of trace:
##    LD1    LD2    LD3 
## 0.9530 0.0359 0.0111
# the function for lda biplot arrows
lda.arrows <- function(x, myscale = 1, arrow_heads = 0.1, color = "red", tex = 0.75, choices = c(1,2)){
  heads <- coef(x)
  arrows(x0 = 0, y0 = 0, 
         x1 = myscale * heads[,choices[1]], 
         y1 = myscale * heads[,choices[2]], col=color, length = arrow_heads)
  text(myscale * heads[,choices], labels = row.names(heads), 
       cex = tex, col=color, pos=3)
}

# target classes as numeric
classes <- as.numeric(boston_train$crime)

# plot the lda results
plot(lda.fit, dimen = 2, col = classes, pch = classes, main = "LDA biplot of Boston training data")
lda.arrows(lda.fit, myscale = 2)

The linear discriminant analysis classifies the observations by the “crime” variable. In the LDA biplot we can see how the different crime rate classes separate.
The LDA finds linear combinations of the variables (the linear discriminants) that best separate the classes; here LD1 alone captures about 95% of the between-group variance (see the proportion of trace above). Now that we have fitted the LDA on the training data set, we use its outcome to predict the crime classes in the test data set.
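As a minimal, self-contained sketch of this fit-then-predict pattern (using the built-in iris data with the MASS package, not our Boston data):

```r
# fit LDA on a random 80% of iris, then predict the species of the remaining 20%
library(MASS)  # provides lda()

set.seed(42)
train_idx <- sample(nrow(iris), size = 0.8 * nrow(iris))

fit  <- lda(Species ~ ., data = iris[train_idx, ])
pred <- predict(fit, newdata = iris[-train_idx, ])$class

# cross tabulate correct vs. predicted species
table(correct = iris$Species[-train_idx], predicted = pred)
```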


1.3.5. Prediction of crime classes on the test data set

# predict classes with test data
lda.pred <- predict(lda.fit, newdata = boston_test)
lda.pred$class
##   [1] med_low  med_low  med_low  med_low  med_low  med_high low     
##   [8] med_low  low      med_low  med_low  low      med_low  med_low 
##  [15] med_low  med_low  low      med_low  med_low  med_low  med_low 
##  [22] med_high med_low  med_low  med_low  med_low  med_low  med_high
##  [29] med_high med_high med_high med_high med_high med_high med_high
##  [36] med_high med_low  low      low      low      low      med_low 
##  [43] med_low  med_high med_low  med_high med_low  med_low  low     
##  [50] med_low  med_low  low      low      med_low  med_low  med_low 
##  [57] med_low  med_low  med_low  med_low  low      low      high    
##  [64] high     high     high     high     high     high     high    
##  [71] high     high     high     high     high     high     high    
##  [78] high     high     high     high     high     high     high    
##  [85] high     high     high     high     high     high     high    
##  [92] high     high     high     high     high     med_high med_high
##  [99] med_high med_high low      med_high
## Levels: low med_low med_high high
# cross tabulate the results
pred_table <- table(correct = correct_classes, predicted = lda.pred$class)

knitr::kable(pred_table, align = "c",  caption="Prediction of test data" ) %>% kable_styling(bootstrap_options = "striped", full_width = F, position = "center") %>% add_header_above(c(" " = 1, "Predicted crime rate" = 4)) %>%  pack_rows("Correct rate", 1, 4)
Prediction of test data
Predicted crime rate
low med_low med_high high
Correct rate
low 13 8 1 0
med_low 1 18 5 0
med_high 0 10 12 2
high 0 0 0 32

The prediction with the linear discriminant analysis is only moderately successful.
The results are weakest for the lower crime rates: low and med_low observations were often predicted one class too high, while med_high observations were often predicted as med_low. Only the high crime rate class was predicted perfectly (all 32 observations correct).
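The misclassification pattern can be summarized with the overall accuracy. Here is a base-R check that rebuilds the cross-tabulation above and computes the share of correct predictions:

```r
# rebuild the prediction table from above (rows = correct, cols = predicted)
pred_counts <- matrix(c(13,  8,  1,  0,
                         1, 18,  5,  0,
                         0, 10, 12,  2,
                         0,  0,  0, 32),
                      nrow = 4, byrow = TRUE,
                      dimnames = list(correct   = c("low", "med_low", "med_high", "high"),
                                      predicted = c("low", "med_low", "med_high", "high")))

# correct predictions sit on the diagonal
accuracy <- sum(diag(pred_counts)) / sum(pred_counts)
round(accuracy, 3)  # 0.735, i.e. roughly three out of four test observations
```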


1.3.6. Reload and standardize the Boston dataset and calculate the distances

Now we are going to perform a data clustering. This is different from classification, because in classification the classes are defined beforehand (in our case the crime rate classes from low to high). In clustering, the analysis itself reveals where the data splits into groups.

So here we are going to reload the original ‘Boston’ data set and scale it again into a new data frame ‘boston_scaled2’. Then we calculate the Euclidean (and Manhattan) distances between the observations, run the k-means clustering and visualise the clusters.
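Before the full run, a tiny example of what the two distance measures compute for a pair of observations:

```r
# euclidean vs. manhattan distance between the points (0, 0) and (3, 4)
m <- rbind(a = c(0, 0), b = c(3, 4))
dist(m)                        # euclidean: sqrt(3^2 + 4^2) = 5
dist(m, method = "manhattan")  # manhattan: |3 - 0| + |4 - 0| = 7
```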

#load boston_scaled data set
data(Boston) # load the Boston data set

# scale the Boston data set again - named boston_scaled2
boston_scaled2 <- scale(Boston)
knitr::kable(head(boston_scaled2), caption = "Scaled Boston data set" ) %>% 
              kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
              scroll_box(width = "100%", height = "300px") # the data frame head
Scaled Boston data set
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
-0.4193669 0.2845483 -1.2866362 -0.2723291 -0.1440749 0.4132629 -0.1198948 0.140075 -0.9818712 -0.6659492 -1.4575580 0.4406159 -1.0744990 0.1595278
-0.4169267 -0.4872402 -0.5927944 -0.2723291 -0.7395304 0.1940824 0.3668034 0.556609 -0.8670245 -0.9863534 -0.3027945 0.4406159 -0.4919525 -0.1014239
-0.4169290 -0.4872402 -0.5927944 -0.2723291 -0.7395304 1.2814456 -0.2655490 0.556609 -0.8670245 -0.9863534 -0.3027945 0.3960351 -1.2075324 1.3229375
-0.4163384 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.076671 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886
-0.4120741 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.2273620 -0.5106743 1.076671 -0.7521778 -1.1050216 0.1129203 0.4406159 -1.0254866 1.4860323
-0.4166314 -0.4872402 -1.3055857 -0.2723291 -0.8344581 0.2068916 -0.3508100 1.076671 -0.7521778 -1.1050216 0.1129203 0.4101651 -1.0422909 0.6705582
# euclidean distance matrix
dist_eu <- dist(boston_scaled2)

# look at the summary of the distances
summary(dist_eu)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.1343  3.4625  4.8241  4.9111  6.1863 14.3970
# manhattan distance matrix
dist_man <- dist(boston_scaled2, method = "manhattan")

# look at the summary of the distances
summary(dist_man)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.2662  8.4832 12.6090 13.5488 17.7568 48.8618
# k-means clustering
km_boston <- kmeans(boston_scaled2, centers = 3)

# plot the Boston dataset with clusters
pairs(boston_scaled2, col = km_boston$cluster)

pairs(boston_scaled2[,1:5], col = km_boston$cluster)

pairs(boston_scaled2[,6:10], col = km_boston$cluster)

pairs(boston_scaled2[,11:14], col = km_boston$cluster)

I have performed the k-means clustering with several cluster numbers and think that 3 clusters give the best overview of the data. The clustering worked well on the crim variable, on the nitrogen oxides concentration (nox), and also on age, tax and medv. Next we check in a plot over 1 - 10 clusters where the total within-cluster sum of squares (twcss) decreases: the cluster number at which this value drops sharply gives a good summary of the data.
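What kmeans()$tot.withinss measures can be checked by hand on a small toy data set: it is the summed squared distance of every observation to the centre of its own cluster.

```r
set.seed(1)
x  <- matrix(rnorm(40), ncol = 2)   # 20 toy observations, 2 variables
km <- kmeans(x, centers = 2)

# squared distances from each point to the centre of its assigned cluster
by_hand <- sum((x - km$centers[km$cluster, ])^2)
all.equal(by_hand, km$tot.withinss)  # TRUE
```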

1.3.7. Determine the k

set.seed(123)


# determine the number of clusters
k_max <- 10

# calculate the total within sum of squares
twcss <- sapply(1:k_max, function(k){kmeans(boston_scaled2, k)$tot.withinss})

# visualize the results
qplot(x = 1:k_max, y = twcss, geom = 'line')

# k-means clustering
km_boston <- kmeans(boston_scaled2, centers = 3)

# plot the Boston dataset with clusters
pairs(boston_scaled2, col = km_boston$cluster)

The twcss value decreases sharply at 2 - 3 clusters. So I think the data is optimally clustered with three clusters, as seen in the overview plot.

1.3.8. Bonus

  • Perform the LDA using the clusters as target.
  • Include all the variables in the Boston data in the LDA model.
  • Visualize the results with a biplot (include arrows representing the relationships of the original variables to the LDA solution)
data(Boston) # load the Boston data set

# scale the Boston data set again - named boston_scaled3
boston_scaled3 <- scale(Boston)

# k-means clustering
km_boston <- kmeans(boston_scaled3, centers = 3)
cluster <- km_boston$cluster

# add the cluster number to the dataframe
boston_scaled3 <- data.frame(boston_scaled3, cluster)

# linear discriminant analysis of clusters vs. all other variables
lda.fit_cluster <- lda(cluster ~ ., data = boston_scaled3)

# print the lda.fit object
lda.fit_cluster
## Call:
## lda(cluster ~ ., data = boston_scaled3)
## 
## Prior probabilities of groups:
##         1         2         3 
## 0.2470356 0.3260870 0.4268775 
## 
## Group means:
##         crim         zn      indus         chas        nox         rm
## 1 -0.3989700  1.2614609 -0.9791535 -0.020354653 -0.8573235  1.0090468
## 2  0.7982270 -0.4872402  1.1186734  0.014005495  1.1351215 -0.4596725
## 3 -0.3788713 -0.3578148 -0.2879024  0.001080671 -0.3709704 -0.2328004
##           age        dis        rad        tax     ptratio      black
## 1 -0.96130713  0.9497716 -0.5867985 -0.6709807 -0.80239137  0.3552363
## 2  0.79930921 -0.8549214  1.2113527  1.2873657  0.59162230 -0.6363367
## 3 -0.05427143  0.1034286 -0.5857564 -0.5951053  0.01241316  0.2805140
##        lstat        medv
## 1 -0.9571271  1.06668290
## 2  0.8622388 -0.67953738
## 3 -0.1047617 -0.09820229
## 
## Coefficients of linear discriminants:
##                 LD1         LD2
## crim    -0.03206338 -0.19094456
## zn       0.02935900 -1.07677218
## indus    0.63347352 -0.09917524
## chas     0.02460719  0.10009606
## nox      1.11749317 -0.75995105
## rm      -0.18841682 -0.57360135
## age     -0.12983139  0.47226685
## dis      0.04493809 -0.34585958
## rad      0.67004295 -0.08584353
## tax      1.03992455 -0.58075025
## ptratio  0.25864960 -0.02605279
## black   -0.01657236  0.01975686
## lstat    0.17365575 -0.41704235
## medv    -0.06819126 -0.79098605
## 
## Proportion of trace:
##    LD1    LD2 
## 0.8506 0.1494
# the function for lda biplot arrows
lda.arrows <- function(x, myscale = 1, arrow_heads = 0.1, color = "red", tex = 0.75, choices = c(1,2)){
  heads <- coef(x)
  arrows(x0 = 0, y0 = 0, 
         x1 = myscale * heads[,choices[1]], 
         y1 = myscale * heads[,choices[2]], col=color, length = arrow_heads)
  text(myscale * heads[,choices], labels = row.names(heads), 
       cex = tex, col=color, pos=3)
}

# target classes as numeric
classes3 <- as.numeric(boston_scaled3$cluster)
# plot the lda results

plot(lda.fit_cluster, dimen = 2, col = classes3, pch = classes3, main = "LDA biplot using three clusters 1, 2 and 3")
lda.arrows(lda.fit_cluster, myscale = 2)

So I scaled the Boston data set, defined the k-means ‘km_boston’ clusters and added the cluster number to the scaled data frame. Then I ran the LDA model with the clusters as target against all other variables and added the arrows.
In the LDA biplot you can see how nicely the clusters are separated from each other - so the clustering actually worked. The most influential linear separators seem to be “age”, “zn” (proportion of residential land zoned for lots over 25,000 sq.ft.), “nox” (nitrogen oxides concentration) and “tax” (full-value property-tax rate).

1.3.9. Super-Bonus

#run the code for the scaled training data set
model_predictors <- dplyr::select(boston_train, -crime)

# check the dimensions
dim(model_predictors)
## [1] 404  13
dim(lda.fit$scaling)
## [1] 13  3
# matrix multiplication
matrix_product <- as.matrix(model_predictors) %*% lda.fit$scaling
matrix_product <- as.data.frame(matrix_product)

# load the plotly package (install it first if necessary)
library(plotly)

crimeplot <- plot_ly(x = matrix_product$LD1, y = matrix_product$LD2, z = matrix_product$LD3, type= 'scatter3d', mode='markers', color = boston_train$crime)
crimeplot
# kmeans clustering of the training-set predictors, so that every row of
# 'matrix_product' gets exactly one cluster label
km_train <- kmeans(model_predictors, centers = 3)

# 3D plot of the projected training data, coloured by the k-means clusters
clusterplot <- plot_ly(x = matrix_product$LD1, y = matrix_product$LD2, z = matrix_product$LD3, type= 'scatter3d', mode='markers', color = factor(km_train$cluster))
clusterplot

The first 3D plot shows the crime training data projected onto the linear discriminants, coloured by the crime classes. My first attempt at the cluster plot did not work because I coloured by a whole data frame (a random 80% sample of ‘boston_scaled3’) whose rows did not match ‘matrix_product’; plot_ly needs one colour value per plotted point, so colouring by k-means clusters computed on the same training observations fixes the plot.
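The matrix multiplication above only works because the shapes line up: the 404 x 13 predictor matrix times the 13 x 3 scaling matrix yields a 404 x 3 matrix of discriminant scores. A toy base-R check of the shape rule:

```r
# (n x p) %*% (p x d) gives an n x d matrix, e.g. 2 x 3 times 3 x 4 -> 2 x 4
A <- matrix(1:6, nrow = 2)    # 2 x 3
B <- matrix(1:12, nrow = 3)   # 3 x 4
dim(A %*% B)
```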


Chapter 5: Dimensionality reduction techniques

Data wrangling and performing principal component analysis (PCA) & multiple correspondence analysis (MCA)

Work of week 48 (25.11. - 01.12.2019)


1. Analysis of human data set

The data wrangling was done in two steps: first, combining two data sets into one data frame, and second, refining the data to the variables we want to analyse. The data wrangling script was uploaded to my GitHub repository. You can find the data wrangling script here.

1.1. Load the data set

# load necessary packages
library(tidyr)
library(dplyr)
library(corrplot)
library(ggplot2)
library(GGally)
library(knitr)
library(kableExtra)
library(stringr)
library(ggfortify)
library(factoextra)
# load the data set "human"
human <- read.table(file = 
             "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/human_analys.txt")

1.2. The data set (overview and correlations)

The original complete data set from the United Nations was wrangled and combined into the so-called “human” data set. This human data set consists of 8 variables with 155 observations. The data set includes the following variables:

  • country –> Name of the country (stored as the row names)
  • ratio.sec.edu –> Ratio of females to males with at least secondary education
  • ratio.lab.force –> Ratio of females to males in the labour force
  • edu.expect –> Expected years of schooling
  • life.exp –> Life expectancy at birth
  • GNI –> Gross National Income per capita
  • mat.mor.r –> Maternal mortality ratio
  • adol.birth –> Adolescent birth rate
  • rep.parliament –> Percentage of female representatives in parliament

The data was collected from the United Nations. More information about the data and how it was collected can be found here. Technical notes about calculating the human development indices can be found here.

Here you see the structure, the complete table and the summary of the wrangled human data set.

# check the data set "human"
str(human)
## 'data.frame':    155 obs. of  8 variables:
##  $ ratio.sec.edu  : num  1.007 0.997 0.983 0.989 0.969 ...
##  $ ratio.lab.force: num  0.891 0.819 0.825 0.884 0.829 ...
##  $ edu.expect     : num  17.5 20.2 15.8 18.7 17.9 16.5 18.6 16.5 15.9 19.2 ...
##  $ life.exp       : num  81.6 82.4 83 80.2 81.6 80.9 80.9 79.1 82 81.8 ...
##  $ GNI            : int  64992 42261 56431 44025 45435 43919 39568 52947 42155 32689 ...
##  $ mat.mor.r      : int  4 6 6 5 6 7 9 28 11 8 ...
##  $ adol.birth     : num  7.8 12.1 1.9 5.1 6.2 3.8 8.2 31 14.5 25.3 ...
##  $ rep.parliament : num  39.6 30.5 28.5 38 36.9 36.9 19.9 19.4 28.2 31.4 ...
#  data set table
knitr::kable(human) %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px")
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
Norway 1.0072389 0.8908297 17.5 81.6 64992 4 7.8 39.6
Australia 0.9968288 0.8189415 20.2 82.4 42261 6 12.1 30.5
Switzerland 0.9834369 0.8251001 15.8 83.0 56431 6 1.9 28.5
Denmark 0.9886128 0.8840361 18.7 80.2 44025 5 5.1 38.0
Netherlands 0.9690608 0.8286119 17.9 81.6 45435 6 6.2 36.9
Germany 0.9927835 0.8072289 16.5 80.9 43919 7 3.8 36.9
Ireland 1.0241730 0.7797357 18.6 80.9 39568 9 8.2 19.9
United States 1.0031646 0.8171263 16.5 79.1 52947 28 31.0 19.4
Canada 1.0000000 0.8676056 15.9 82.0 42155 11 14.5 28.2
New Zealand 0.9968520 0.8401084 19.2 81.8 32689 8 25.3 31.4
Singapore 0.9148148 0.7616580 15.4 83.0 76628 6 6.0 25.3
Sweden 0.9908362 0.8880707 15.8 82.2 45636 4 6.5 43.6
United Kingdom 0.9989990 0.8107715 16.2 80.7 39267 8 25.8 23.5
Iceland 0.9934498 0.9108527 19.0 82.6 35182 4 11.5 41.3
Korea (Republic of) 0.8641975 0.6948682 16.9 81.9 33890 27 2.2 16.3
Israel 0.9667812 0.8379161 16.0 82.4 30676 2 7.8 22.5
Luxembourg 1.0000000 0.7848297 13.9 81.7 58711 11 8.3 28.3
Japan 1.0139860 0.6931818 15.3 83.5 36927 6 5.4 11.6
Belgium 0.9348613 0.8010118 16.3 80.8 41187 6 6.7 42.4
France 0.9375000 0.8230519 16.0 82.2 38056 12 5.7 25.7
Austria 1.0000000 0.8064993 15.7 81.4 43869 4 4.1 30.3
Finland 1.0000000 0.8703125 17.1 80.8 38695 4 9.2 42.5
Slovenia 0.9775510 0.8275316 16.8 80.4 27852 7 0.6 27.7
Spain 0.9138167 0.7978723 17.3 82.6 32045 4 10.6 38.0
Italy 0.8844720 0.6655462 16.0 83.1 33030 4 4.0 30.1
Czech Republic 1.0020060 0.7481698 16.4 78.6 26660 5 4.9 18.9
Greece 0.8880597 0.7072000 17.6 80.9 24524 5 11.9 21.0
Estonia 1.0000000 0.8156749 16.5 76.8 25214 11 16.8 19.8
Cyprus 0.9302326 0.7876231 14.0 80.2 28633 10 5.5 12.5
Qatar 1.1305085 0.5319372 13.8 78.2 123124 6 9.5 0.0
Slovakia 0.9959799 0.7448980 15.1 76.3 25845 7 15.9 18.7
Poland 0.9286550 0.7534669 15.5 77.4 23177 3 12.2 22.1
Lithuania 0.9448568 0.8291233 16.4 73.3 24500 11 10.6 23.4
Malta 0.8772379 0.5716440 14.4 80.6 27930 9 18.2 13.0
Saudi Arabia 0.8605974 0.2579821 16.3 74.3 52821 16 10.2 19.9
Argentina 0.9774306 0.6333333 17.9 76.3 22050 69 54.4 36.8
United Arab Emirates 1.1944444 0.5054348 13.3 77.0 60868 8 27.6 17.5
Chile 0.9594241 0.6577540 15.2 81.7 21290 22 55.3 15.8
Portugal 0.9896266 0.8293051 16.3 80.9 25757 8 12.6 31.3
Hungary 0.9918946 0.7466667 15.4 75.2 22916 14 12.1 10.1
Bahrain 1.1031128 0.4510932 14.4 76.6 38599 22 13.8 15.0
Latvia 0.9989899 0.8121302 15.2 74.2 22281 13 13.5 18.0
Croatia 0.9081197 0.7654110 14.8 77.3 19409 13 12.7 25.8
Kuwait 0.9875666 0.5246691 14.7 74.4 83961 14 14.5 1.5
Montenegro 0.8891235 0.7504363 15.2 76.2 14558 7 15.2 17.3
Belarus 0.9436009 0.7939778 15.7 71.3 16676 1 20.6 30.1
Russian Federation 0.9686486 0.7963738 14.7 70.1 22352 24 25.7 14.5
Oman 0.8266200 0.3510896 13.6 76.8 34858 11 10.6 9.6
Romania 0.9358696 0.7503852 14.2 74.7 18108 33 31.0 12.0
Uruguay 1.0815109 0.7239583 15.5 77.2 19283 14 58.3 11.5
Bahamas 1.0410959 0.8738966 12.6 75.4 21336 37 28.5 16.7
Kazakhstan 0.9645749 0.8690629 15.0 69.4 20867 26 29.9 20.1
Barbados 1.0205245 0.8603133 15.4 75.6 12488 52 48.4 19.6
Bulgaria 0.9717868 0.8118644 14.4 74.2 15596 5 35.9 20.4
Panama 1.0821643 0.5990220 13.3 77.6 18192 85 78.5 19.3
Malaysia 0.9130435 0.5880795 12.7 74.7 22762 29 5.7 14.2
Mauritius 0.8517241 0.5876011 15.6 74.4 17470 73 30.9 11.6
Trinidad and Tobago 0.9802956 0.7019868 12.3 70.4 26090 84 34.8 24.7
Serbia 0.7934783 0.7307061 14.4 74.9 12190 16 16.9 34.0
Cuba 0.9428934 0.6200000 13.8 79.4 7301 80 43.1 48.9
Lebanon 0.9566787 0.3286319 13.8 79.3 16509 16 12.0 3.1
Costa Rica 1.0039604 0.5898734 13.9 79.4 13413 38 60.8 33.3
Iran (Islamic Republic of) 0.9201183 0.2255435 15.1 75.4 15440 23 31.6 3.1
Venezuela (Bolivarian Republic of) 1.1141732 0.6452020 14.2 74.2 16159 110 83.2 17.0
Turkey 0.6500000 0.4152542 14.5 75.3 18677 20 30.9 14.4
Sri Lanka 0.9515707 0.4600262 13.7 74.9 9779 29 16.9 5.8
Mexico 0.9191419 0.5644556 13.1 76.8 16056 49 63.4 37.1
Brazil 1.0419847 0.7351485 15.2 74.5 15175 69 70.8 9.6
Georgia 0.9676375 0.7523302 13.8 74.9 7164 41 46.8 11.3
Azerbaijan 0.9620123 0.9037356 11.9 70.8 16428 26 40.0 15.6
Jordan 0.8853503 0.2342342 13.5 74.0 11365 50 26.5 11.6
The former Yugoslav Republic of Macedonia 0.7230216 0.6385185 13.4 75.4 11780 7 18.3 33.3
Ukraine 0.9562044 0.7952167 15.1 71.0 8178 23 25.7 11.8
Algeria 0.8612903 0.2105263 14.0 74.8 13054 89 10.0 25.7
Peru 0.8517398 0.8080569 13.1 74.6 11015 89 50.7 22.3
Albania 0.9306030 0.6854962 11.8 77.8 9943 21 15.3 20.7
Armenia 0.9894737 0.7465565 12.3 74.7 8124 29 27.1 10.7
Bosnia and Herzegovina 0.6432665 0.5951134 13.6 76.5 9638 8 15.1 19.3
Ecuador 1.0177665 0.6614268 14.2 75.9 10605 87 77.0 41.6
China 0.8164117 0.8160920 13.1 75.8 12547 32 8.6 23.6
Fiji 0.9953488 0.5208333 15.7 70.0 7493 59 42.8 14.0
Mongolia 1.0142687 0.8167388 14.6 69.4 10729 68 18.7 14.9
Thailand 0.8750000 0.7967782 13.5 74.4 13323 26 41.0 6.1
Libya 1.3245823 0.3926702 14.0 71.6 14911 15 2.5 16.0
Tunisia 0.7114967 0.3540197 14.6 74.8 10404 46 4.6 31.3
Colombia 1.0233813 0.7001255 13.5 74.0 12040 83 68.5 20.9
Jamaica 1.0541311 0.7912553 12.4 75.7 7415 80 70.1 16.7
Tonga 0.9909400 0.7171582 14.7 72.8 5069 120 18.1 0.0
Belize 1.0079156 0.5978129 13.6 70.0 7614 45 71.4 13.3
Dominican Republic 1.0470810 0.6526718 13.1 73.5 11883 100 99.6 19.1
Suriname 0.9469214 0.5886628 12.7 71.1 15617 130 35.2 11.8
Maldives 0.8348624 0.7251613 13.0 76.8 12328 31 4.2 5.9
Samoa 1.0716667 0.4023973 12.9 73.4 5327 58 28.3 6.1
Botswana 0.9448010 0.8811275 12.5 64.5 16646 170 44.2 9.5
Moldova (Republic of) 0.9689441 0.8506787 11.9 71.6 5223 21 29.3 20.8
Egypt 0.7244224 0.3168449 13.5 71.1 10512 45 43.0 2.2
Gabon 1.4930748 0.8593272 12.5 64.4 16367 240 103.0 16.2
Indonesia 0.8109756 0.6104513 13.0 68.9 9788 190 48.3 17.1
Paraguay 0.8558140 0.6568396 11.9 72.9 7643 110 67.0 16.8
Philippines 1.0345369 0.6411543 11.3 68.2 7915 120 46.8 27.1
El Salvador 0.8440367 0.6050633 12.3 73.0 7349 69 76.0 27.4
South Africa 0.9578393 0.7355372 13.6 57.4 12122 140 50.9 40.7
Viet Nam 0.8342697 0.8880779 11.9 75.8 5092 49 29.0 24.3
Bolivia (Plurinational State of) 0.8054146 0.7935723 13.2 68.3 5760 200 71.9 51.8
Kyrgyzstan 0.9762397 0.7044025 12.5 70.6 3044 75 29.3 23.3
Iraq 0.5537849 0.2134670 10.1 69.4 14003 67 68.7 26.5
Guyana 1.2615063 0.5291925 10.3 66.4 6522 250 88.5 31.3
Nicaragua 1.0287206 0.5902864 11.5 74.9 4457 100 100.8 39.1
Morocco 0.6854305 0.3496042 11.6 74.0 6850 120 35.8 11.0
Namibia 0.9680233 0.8587127 11.3 64.8 9418 130 54.9 37.7
Guatemala 0.9439655 0.5589569 10.7 71.8 6929 140 97.2 13.3
Tajikistan 1.0427632 0.7639429 11.2 69.4 2517 44 42.8 15.2
India 0.4770318 0.3379224 11.7 68.0 5497 190 32.8 12.2
Honduras 1.0852713 0.5162847 11.1 73.1 3938 120 84.0 25.8
Bhutan 0.9855072 0.8639896 12.6 69.5 7176 120 40.9 8.3
Syrian Arab Republic 0.7283951 0.1856946 12.3 69.6 2728 49 41.6 12.4
Congo 0.8446809 0.9383562 11.1 62.3 6012 410 126.7 11.5
Zambia 0.5863636 0.8539720 13.5 60.1 3734 280 125.4 12.7
Ghana 0.6986090 0.9425770 11.5 61.4 3852 380 58.4 10.9
Bangladesh 0.8256659 0.6825208 10.0 71.6 3191 170 80.6 20.0
Cambodia 0.4323144 0.9109827 10.9 68.4 2949 170 44.3 19.0
Kenya 0.8057325 0.8591160 11.0 61.6 2762 400 93.6 20.8
Nepal 0.4633508 0.9173364 12.4 69.6 2311 190 73.7 29.5
Pakistan 0.4186551 0.2967431 7.8 66.2 4866 170 27.3 19.7
Myanmar 1.4967320 0.9137303 8.6 65.9 4608 200 12.1 4.7
Swaziland 0.8423077 0.6131285 11.3 49.0 5542 310 72.0 14.7
Tanzania (United Republic of) 0.5894737 0.9767184 9.2 65.0 2411 410 122.7 36.0
Cameroon 0.6103152 0.8307292 10.4 55.5 2803 590 115.8 27.1
Zimbabwe 0.7854839 0.9275362 10.9 57.5 1615 470 60.3 35.1
Mauritania 0.3971292 0.3628319 8.5 63.1 3560 320 73.3 22.2
Papua New Guinea 0.5241379 0.9527027 9.9 62.6 2463 220 62.1 2.7
Yemen 0.3220974 0.3518006 9.2 63.8 3519 270 47.0 0.7
Lesotho 1.1526316 0.8027211 11.1 49.8 3306 490 89.4 26.8
Togo 0.3995037 0.9913899 12.2 59.7 1228 450 91.5 17.6
Haiti 0.6363636 0.8577465 8.7 62.8 1669 380 42.0 3.5
Rwanda 0.9090909 1.0128957 10.3 64.2 1458 320 33.6 57.5
Uganda 0.6835821 0.9570707 9.8 58.5 1613 360 126.6 35.0
Benin 0.4185185 0.8633461 11.1 59.6 1767 340 90.2 8.4
Sudan 0.6648352 0.4118421 7.0 63.5 3809 360 84.0 23.8
Senegal 0.4675325 0.7500000 7.9 66.5 2188 320 94.4 42.7
Afghanistan 0.1979866 0.1987421 9.3 60.4 1885 400 86.8 27.6
Côte d’Ivoire 0.4651163 0.6437346 8.9 51.5 3171 720 130.3 9.2
Malawi 0.5138889 1.0380368 10.8 62.8 747 510 144.8 16.7
Ethiopia 0.4285714 0.8756999 8.5 64.1 1428 420 78.4 25.5
Gambia 0.5523810 0.8709288 8.8 60.2 1507 430 115.8 9.4
Congo (Democratic Republic of the) 0.3950617 0.9658470 9.8 58.7 680 730 135.3 8.2
Liberia 0.3918575 0.8981481 9.5 60.9 805 640 117.4 10.7
Mali 0.5099338 0.6240786 8.4 58.0 1583 550 175.6 9.5
Mozambique 0.2258065 1.0326087 9.3 55.1 1123 480 137.8 39.6
Sierra Leone 0.4608295 0.9521739 8.6 50.9 1780 1100 100.7 12.4
Burkina Faso 0.2812500 0.8566667 7.8 58.7 1591 400 115.4 13.3
Burundi 0.6385542 1.0158537 10.1 56.7 758 740 30.3 34.9
Chad 0.1717172 0.8080808 7.4 51.6 2085 980 152.0 14.9
Central African Republic 0.3782772 0.8531140 7.2 50.7 581 880 98.3 12.5
Niger 0.3076923 0.4459309 5.4 61.4 908 630 204.8 13.3
# data summary
knitr::kable(summary(human)) %>% 
  kable_styling(bootstrap_options = "striped", position = "center", font_size = 11)
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
Min. :0.1717 Min. :0.1857 Min. : 5.40 Min. :49.00 Min. : 581 Min. : 1.0 Min. : 0.60 Min. : 0.00
1st Qu.:0.7264 1st Qu.:0.5984 1st Qu.:11.25 1st Qu.:66.30 1st Qu.: 4198 1st Qu.: 11.5 1st Qu.: 12.65 1st Qu.:12.40
Median :0.9375 Median :0.7535 Median :13.50 Median :74.20 Median : 12040 Median : 49.0 Median : 33.60 Median :19.30
Mean :0.8529 Mean :0.7074 Mean :13.18 Mean :71.65 Mean : 17628 Mean : 149.1 Mean : 47.16 Mean :20.91
3rd Qu.:0.9968 3rd Qu.:0.8535 3rd Qu.:15.20 3rd Qu.:77.25 3rd Qu.: 24512 3rd Qu.: 190.0 3rd Qu.: 71.95 3rd Qu.:27.95
Max. :1.4967 Max. :1.0380 Max. :20.20 Max. :83.50 Max. :123124 Max. :1100.0 Max. :204.80 Max. :57.50

Graphical overview of the human data set

# visualization of human data set
ov_human <- ggpairs(human, mapping = aes(), title ="Overview of the human data set", 
                     lower = list(combo = wrap("facethist", bins = 20)), 
                     upper = list(continuous = wrap("cor", size = 3)))
ov_human

The overview plot shows the distribution of each variable in the data set together with the pairwise correlations. Most of the variables are roughly normally distributed. GNI, maternal mortality rate and adolescent birth rate show right-skewed distributions, i.e. most observations pile up at low values with a long tail towards high values. The upper panel shows the correlations of the variables as numeric values. Below, the correlation matrix and a correlation plot present the same values graphically.
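The skew direction can also be checked numerically with a moment-based skewness statistic, which is positive for a long right tail. This is a sketch on simulated data; the `skewness` helper and the simulated vectors are my own illustration, not part of the course code:

```r
# moment-based skewness: positive = long right tail (mass piled at low values)
skewness <- function(x) mean(((x - mean(x)) / sd(x))^3)

set.seed(1)
right_skewed <- rexp(1000)   # GNI-like shape: many low values, a few very high
symmetric    <- rnorm(1000)  # roughly normal, for comparison

skewness(right_skewed)  # clearly positive
skewness(symmetric)     # near zero
```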

# calculate the correlation matrix and round it
cor_human <- cor(human) %>% round(digits = 2)

cor_human %>% knitr::kable(caption = "Correlation table of human data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Correlation table of human data set
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
ratio.sec.edu 1.00 0.01 0.59 0.58 0.43 -0.66 -0.53 0.08
ratio.lab.force 0.01 1.00 0.05 -0.14 -0.02 0.24 0.12 0.25
edu.expect 0.59 0.05 1.00 0.79 0.62 -0.74 -0.70 0.21
life.exp 0.58 -0.14 0.79 1.00 0.63 -0.86 -0.73 0.17
GNI 0.43 -0.02 0.62 0.63 1.00 -0.50 -0.56 0.09
mat.mor.r -0.66 0.24 -0.74 -0.86 -0.50 1.00 0.76 -0.09
adol.birth -0.53 0.12 -0.70 -0.73 -0.56 0.76 1.00 -0.07
rep.parliament 0.08 0.25 0.21 0.17 0.09 -0.09 -0.07 1.00
# compute p-values for the correlations to mark insignificant ones
# (cor.mtest needs the original data, not the correlation matrix)
p.mat <- cor.mtest(human)$p

# visualize the correlation matrix
# correlations / colour shows the correlation values
corrplot(cor_human, method="pie", type="lower",  tl.cex = 0.65, p.mat = p.mat, sig.level = 0.01, tl.srt = 45, title="Correlations of the human data set", mar=c(0,0,1,0))  

This correlation plot nicely shows which variables correlate positively or negatively with each other. Crossed cells mark correlations that are not significant (p-value > 0.01). Strong positive correlations are seen between life expectancy and expected years of education, between GNI and both expected years of education and life expectancy, and between adolescent birth rate and maternal mortality. So better education goes along with higher income and higher life expectancy, and maternal mortality rises with the adolescent birth rate.


1.3. Data Analysis

1.3.1. Principal component analysis (PCA) on the non-standardized human data set

Generally: in a PCA the data are transformed into new features called principal components. The first principal component (PC1) captures the maximum amount of variance of the features in the original data. The second PC captures the maximum amount of the remaining variance and is orthogonal to the first PC (at a right angle to it). All principal components are uncorrelated with each other. Here we use PC1 and PC2 for the data interpretation.
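The "uncorrelated" property can be verified directly: the correlation matrix of the principal component scores is diagonal. A minimal sketch on a toy matrix (my own illustration, not from the course code):

```r
# PC scores from prcomp() are mutually uncorrelated
set.seed(42)
X <- matrix(rnorm(200), ncol = 4)
X[, 2] <- X[, 1] + 0.5 * X[, 2]   # introduce correlation between columns

pca <- prcomp(X, scale. = TRUE)

# off-diagonal correlations of the scores are (numerically) zero
round(cor(pca$x), digits = 10)
```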

Here is the principal component analysis of the non-standardized human data set: a table with the variable loadings on principal components 1 & 2, and the PCA biplot.

# perform PCA on not standardized data set
pca_human <- prcomp(human)

# PCA results on PC1 and PC2 on all variables
round(pca_human$rotation[,1:2], digits = 4) %>% knitr::kable(caption = "PCA result on all variables") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
PCA result on all variables
PC1 PC2
ratio.sec.edu 0.0000 0.0007
ratio.lab.force 0.0000 -0.0003
edu.expect -0.0001 0.0076
life.exp -0.0003 0.0283
GNI -1.0000 -0.0058
mat.mor.r 0.0057 -0.9916
adol.birth 0.0012 -0.1256
rep.parliament -0.0001 0.0032
# summary of the pca
s1 <- summary(pca_human)

# round the percentages of variance captured by the pc
pca_prc1 <- round(100 * s1$importance[2, ], digits = 3) 

# Principal components 1 & 2 table
round(100 * s1$importance[2, 1:2], digits = 5) %>% knitr::kable(caption = "Principal components") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Principal components
x
PC1 99.99
PC2 0.01
# prepare a label for the biplot
pca_label1 <-  paste0("GNI (", pca_prc1[1],"%)" )
pca_label2 <-  paste0("Maternal mortality (", pca_prc1[2],"%)" )

# draw the pca biplot
biplot(pca_human, choices = 1:2, cex = c(0.55, 0.9), col = c("grey40", "darkgreen"), xlab = pca_label1, 
                      ylab = pca_label2, main = "PCA biplot of non-standardized human data set") 

Let’s try another kind of biplot

# more advanced biplot
autoplot(pca_human, data = human, label = TRUE, label.size = 3.0, colour = "darkgreen", loadings = TRUE, loadings.label = TRUE, loadings.colour = "red") + ggtitle("PCA biplot of non-standardized human data set") + xlab(pca_label1) + ylab(pca_label2) + theme_bw()

Principal component 1 captures 99.99% of the variance of the data set; PC2 captures just 0.01%. PC1 is almost entirely associated with the GNI variable: because the data are not standardized, the Gross National Income per capita, which has by far the largest absolute variance, dominates PC1. The remaining 0.01% (PC2) is associated with the maternal mortality rate.

1.3.2. Principal component analysis (PCA) on the standardized human data set

First we scale the human data set so that it is standardized: each variable is transformed to have a mean of 0 and a standard deviation of 1. See here the summary table of the standardized data set.

# scaling of human data set
human_stzd <- scale(human)

# summary of standardized human data set
knitr::kable(summary(human_stzd)) %>% 
  kable_styling(bootstrap_options = "striped", position = "center", font_size = 11)
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
Min. :-2.8189 Min. :-2.6247 Min. :-2.7378 Min. :-2.7188 Min. :-0.9193 Min. :-0.6992 Min. :-1.1325 Min. :-1.8203
1st Qu.:-0.5233 1st Qu.:-0.5484 1st Qu.:-0.6782 1st Qu.:-0.6425 1st Qu.:-0.7243 1st Qu.:-0.6496 1st Qu.:-0.8394 1st Qu.:-0.7409
Median : 0.3503 Median : 0.2316 Median : 0.1140 Median : 0.3056 Median :-0.3013 Median :-0.4726 Median :-0.3298 Median :-0.1403
Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000
3rd Qu.: 0.5958 3rd Qu.: 0.7350 3rd Qu.: 0.7126 3rd Qu.: 0.6717 3rd Qu.: 0.3712 3rd Qu.: 0.1932 3rd Qu.: 0.6030 3rd Qu.: 0.6127
Max. : 2.6646 Max. : 1.6632 Max. : 2.4730 Max. : 1.4218 Max. : 5.6890 Max. : 4.4899 Max. : 3.8344 Max. : 3.1850

In the summary table you can see that all mean values are 0. So the scaling of the human data set was successful.
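What scale() does can be written out by hand: each column is centred by its mean and divided by its standard deviation. A minimal sketch on a toy vector (my own illustration):

```r
# manual standardization, equivalent to scale() column-wise
x <- c(2, 4, 6, 8, 10)
x_std <- (x - mean(x)) / sd(x)   # centre to mean 0, scale to sd 1

all.equal(as.numeric(scale(x)), x_std)  # TRUE
mean(x_std)                             # 0
sd(x_std)                               # 1
```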
Now the PCA of this standardized data follows.

# perform PCA on standardized data set
pca_human_stzd <- prcomp(human_stzd)

# Principal components 1 & 2 table
round(pca_human_stzd$rotation[, 1:2], digits = 4) %>% knitr::kable(caption = "PC1 & 2 of standardized human data") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
PC1 & 2 of standardized human data
PC1 PC2
ratio.sec.edu -0.3566 0.0380
ratio.lab.force 0.0546 0.7243
edu.expect -0.4277 0.1394
life.exp -0.4437 -0.0253
GNI -0.3505 0.0506
mat.mor.r 0.4370 0.1451
adol.birth 0.4113 0.0771
rep.parliament -0.0844 0.6514
# PC labels
s2 <- summary(pca_human_stzd)
pca_prc2 <- round(100 * s2$importance[2, ], digits = 1) 

pca_label2.1 <- paste0("Education and health (",pca_prc2[1],"%) ")
pca_label2.2 <- paste0("Female social participation (",pca_prc2[2],"%) ")

# Principal components 1 & 2 table
round(100 * s2$importance[2, 1:2], digits = 2) %>% knitr::kable(caption = "Principal components") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Principal components
x
PC1 53.61
PC2 16.24
# draw the PCA biplot
biplot(pca_human_stzd, choices = 1:2, cex = c(0.5, 0.9), col = c("grey40", "deeppink2"), xlab = pca_label2.1, ylab = pca_label2.2, main = "PCA biplot of standardized human data set")

Here, too, the other kind of biplot

# more advanced biplot
autoplot(pca_human_stzd, data = human, colour = "darkgreen", label = FALSE, label.size = 3.0, loadings = TRUE, loadings.label = TRUE, loadings.colour = "red") + ggtitle("PCA biplot of standardized human data set") + xlab(pca_label2.1) + ylab(pca_label2.2) + theme_bw()

The principal component capturing 16.2% of the data variation (PC2) is strongly associated with the share of female parliament representatives and the labour force ratio (female/male). So the participation of women in the labour market and in political representation is what PC2 mostly reflects. The variables with the strongest association with principal component 1 (capturing 53.6% of the data variation) are life expectancy, expected years of education, the female/male ratio in secondary education, the adolescent birth rate and maternal mortality. So overall, principal component 1 represents the health and education situation of the countries.

The PCA results differ a lot! Before performing a PCA we need to make sure that the observations are standardized, i.e. transformed to comparable scales.
In the PCA of the non-standardized human data set, principal component 1 captures almost all of the variation, and the Gross National Income per capita is essentially the only variable behind it.
The PCA of the standardized values looks quite different: the observations spread across the whole biplot and several variables contribute to each of the two principal components.

The arrows in the biplot represent the original features of the data set, and the angle between two arrows can be interpreted as the correlation between those features: the smaller the angle, the stronger the positive correlation.
Likewise, the angle between a feature arrow and a principal component axis reflects the correlation between the two; again, a small angle means a positive correlation. The length of an arrow is proportional to the standard deviation of the feature.
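This angle-correlation reading can be checked numerically: when the first two PCs capture most of the variance, the cosine of the angle between two (sd-scaled) loading arrows approximates the correlation of the underlying variables. A sketch on simulated data (variable names `a`, `b`, `z` and the `cos_angle` helper are my own):

```r
# correlated pair (a, b) and an unrelated variable z
set.seed(7)
a <- rnorm(100)
b <- a + rnorm(100, sd = 0.3)   # strongly correlated with a
z <- rnorm(100)                 # roughly uncorrelated with a
X <- scale(cbind(a, b, z))

pca <- prcomp(X)

# biplot arrows: loadings scaled by the component standard deviations
arrows <- pca$rotation[, 1:2] %*% diag(pca$sdev[1:2])

cos_angle <- function(u, v) sum(u * v) / sqrt(sum(u^2) * sum(v^2))
cos_angle(arrows["a", ], arrows["b", ])  # near cor(a, b), i.e. close to 1
cos_angle(arrows["a", ], arrows["z", ])  # near cor(a, z), i.e. close to 0
```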


2. Multiple Correspondence Analysis (MCA)

library(FactoMineR)
data(tea)

str(tea)
## 'data.frame':    300 obs. of  36 variables:
##  $ breakfast       : Factor w/ 2 levels "breakfast","Not.breakfast": 1 1 2 2 1 2 1 2 1 1 ...
##  $ tea.time        : Factor w/ 2 levels "Not.tea time",..: 1 1 2 1 1 1 2 2 2 1 ...
##  $ evening         : Factor w/ 2 levels "evening","Not.evening": 2 2 1 2 1 2 2 1 2 1 ...
##  $ lunch           : Factor w/ 2 levels "lunch","Not.lunch": 2 2 2 2 2 2 2 2 2 2 ...
##  $ dinner          : Factor w/ 2 levels "dinner","Not.dinner": 2 2 1 1 2 1 2 2 2 2 ...
##  $ always          : Factor w/ 2 levels "always","Not.always": 2 2 2 2 1 2 2 2 2 2 ...
##  $ home            : Factor w/ 2 levels "home","Not.home": 1 1 1 1 1 1 1 1 1 1 ...
##  $ work            : Factor w/ 2 levels "Not.work","work": 1 1 2 1 1 1 1 1 1 1 ...
##  $ tearoom         : Factor w/ 2 levels "Not.tearoom",..: 1 1 1 1 1 1 1 1 1 2 ...
##  $ friends         : Factor w/ 2 levels "friends","Not.friends": 2 2 1 2 2 2 1 2 2 2 ...
##  $ resto           : Factor w/ 2 levels "Not.resto","resto": 1 1 2 1 1 1 1 1 1 1 ...
##  $ pub             : Factor w/ 2 levels "Not.pub","pub": 1 1 1 1 1 1 1 1 1 1 ...
##  $ Tea             : Factor w/ 3 levels "black","Earl Grey",..: 1 1 2 2 2 2 2 1 2 1 ...
##  $ How             : Factor w/ 4 levels "alone","lemon",..: 1 3 1 1 1 1 1 3 3 1 ...
##  $ sugar           : Factor w/ 2 levels "No.sugar","sugar": 2 1 1 2 1 1 1 1 1 1 ...
##  $ how             : Factor w/ 3 levels "tea bag","tea bag+unpackaged",..: 1 1 1 1 1 1 1 1 2 2 ...
##  $ where           : Factor w/ 3 levels "chain store",..: 1 1 1 1 1 1 1 1 2 2 ...
##  $ price           : Factor w/ 6 levels "p_branded","p_cheap",..: 4 6 6 6 6 3 6 6 5 5 ...
##  $ age             : int  39 45 47 23 48 21 37 36 40 37 ...
##  $ sex             : Factor w/ 2 levels "F","M": 2 1 1 2 2 2 2 1 2 2 ...
##  $ SPC             : Factor w/ 7 levels "employee","middle",..: 2 2 4 6 1 6 5 2 5 5 ...
##  $ Sport           : Factor w/ 2 levels "Not.sportsman",..: 2 2 2 1 2 2 2 2 2 1 ...
##  $ age_Q           : Factor w/ 5 levels "15-24","25-34",..: 3 4 4 1 4 1 3 3 3 3 ...
##  $ frequency       : Factor w/ 4 levels "1/day","1 to 2/week",..: 1 1 3 1 3 1 4 2 3 3 ...
##  $ escape.exoticism: Factor w/ 2 levels "escape-exoticism",..: 2 1 2 1 1 2 2 2 2 2 ...
##  $ spirituality    : Factor w/ 2 levels "Not.spirituality",..: 1 1 1 2 2 1 1 1 1 1 ...
##  $ healthy         : Factor w/ 2 levels "healthy","Not.healthy": 1 1 1 1 2 1 1 1 2 1 ...
##  $ diuretic        : Factor w/ 2 levels "diuretic","Not.diuretic": 2 1 1 2 1 2 2 2 2 1 ...
##  $ friendliness    : Factor w/ 2 levels "friendliness",..: 2 2 1 2 1 2 2 1 2 1 ...
##  $ iron.absorption : Factor w/ 2 levels "iron absorption",..: 2 2 2 2 2 2 2 2 2 2 ...
##  $ feminine        : Factor w/ 2 levels "feminine","Not.feminine": 2 2 2 2 2 2 2 1 2 2 ...
##  $ sophisticated   : Factor w/ 2 levels "Not.sophisticated",..: 1 1 1 2 1 1 1 2 2 1 ...
##  $ slimming        : Factor w/ 2 levels "No.slimming",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ exciting        : Factor w/ 2 levels "exciting","No.exciting": 2 1 2 2 2 2 2 2 2 2 ...
##  $ relaxing        : Factor w/ 2 levels "No.relaxing",..: 1 1 2 2 2 2 2 2 2 2 ...
##  $ effect.on.health: Factor w/ 2 levels "effect on health",..: 2 2 2 2 2 2 2 2 2 2 ...
dim(tea)
## [1] 300  36
# column names to keep in the dataset
keep_tea <- c("Tea", "How", "how", "sugar", "where", "lunch")

# select the 'keep_columns' to create a new dataset
tea_time <- select(tea, one_of(keep_tea))

gather(tea_time) %>% ggplot(aes(value)) + facet_wrap("key", scales = "free") + geom_bar(fill = "darkgreen") + theme(axis.text.x = element_text(angle = 45, hjust = 1, size = 8))

# multiple correspondence analysis
mca_tea <- MCA(tea_time, graph = FALSE)

# summary of the model
summary(mca_tea)
## 
## Call:
## MCA(X = tea_time, graph = FALSE) 
## 
## 
## Eigenvalues
##                        Dim.1   Dim.2   Dim.3   Dim.4   Dim.5   Dim.6
## Variance               0.279   0.261   0.219   0.189   0.177   0.156
## % of var.             15.238  14.232  11.964  10.333   9.667   8.519
## Cumulative % of var.  15.238  29.471  41.435  51.768  61.434  69.953
##                        Dim.7   Dim.8   Dim.9  Dim.10  Dim.11
## Variance               0.144   0.141   0.117   0.087   0.062
## % of var.              7.841   7.705   6.392   4.724   3.385
## Cumulative % of var.  77.794  85.500  91.891  96.615 100.000
## 
## Individuals (the 10 first)
##                       Dim.1    ctr   cos2    Dim.2    ctr   cos2    Dim.3
## 1                  | -0.298  0.106  0.086 | -0.328  0.137  0.105 | -0.327
## 2                  | -0.237  0.067  0.036 | -0.136  0.024  0.012 | -0.695
## 3                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 4                  | -0.530  0.335  0.460 | -0.318  0.129  0.166 |  0.211
## 5                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 6                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 7                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 8                  | -0.237  0.067  0.036 | -0.136  0.024  0.012 | -0.695
## 9                  |  0.143  0.024  0.012 |  0.871  0.969  0.435 | -0.067
## 10                 |  0.476  0.271  0.140 |  0.687  0.604  0.291 | -0.650
##                       ctr   cos2  
## 1                   0.163  0.104 |
## 2                   0.735  0.314 |
## 3                   0.062  0.069 |
## 4                   0.068  0.073 |
## 5                   0.062  0.069 |
## 6                   0.062  0.069 |
## 7                   0.062  0.069 |
## 8                   0.735  0.314 |
## 9                   0.007  0.003 |
## 10                  0.643  0.261 |
## 
## Categories (the 10 first)
##                        Dim.1     ctr    cos2  v.test     Dim.2     ctr
## black              |   0.473   3.288   0.073   4.677 |   0.094   0.139
## Earl Grey          |  -0.264   2.680   0.126  -6.137 |   0.123   0.626
## green              |   0.486   1.547   0.029   2.952 |  -0.933   6.111
## alone              |  -0.018   0.012   0.001  -0.418 |  -0.262   2.841
## lemon              |   0.669   2.938   0.055   4.068 |   0.531   1.979
## milk               |  -0.337   1.420   0.030  -3.002 |   0.272   0.990
## other              |   0.288   0.148   0.003   0.876 |   1.820   6.347
## tea bag            |  -0.608  12.499   0.483 -12.023 |  -0.351   4.459
## tea bag+unpackaged |   0.350   2.289   0.056   4.088 |   1.024  20.968
## unpackaged         |   1.958  27.432   0.523  12.499 |  -1.015   7.898
##                       cos2  v.test     Dim.3     ctr    cos2  v.test  
## black                0.003   0.929 |  -1.081  21.888   0.382 -10.692 |
## Earl Grey            0.027   2.867 |   0.433   9.160   0.338  10.053 |
## green                0.107  -5.669 |  -0.108   0.098   0.001  -0.659 |
## alone                0.127  -6.164 |  -0.113   0.627   0.024  -2.655 |
## lemon                0.035   3.226 |   1.329  14.771   0.218   8.081 |
## milk                 0.020   2.422 |   0.013   0.003   0.000   0.116 |
## other                0.102   5.534 |  -2.524  14.526   0.197  -7.676 |
## tea bag              0.161  -6.941 |  -0.065   0.183   0.006  -1.287 |
## tea bag+unpackaged   0.478  11.956 |   0.019   0.009   0.000   0.226 |
## unpackaged           0.141  -6.482 |   0.257   0.602   0.009   1.640 |
## 
## Categorical variables (eta2)
##                      Dim.1 Dim.2 Dim.3  
## Tea                | 0.126 0.108 0.410 |
## How                | 0.076 0.190 0.394 |
## how                | 0.708 0.522 0.010 |
## sugar              | 0.065 0.001 0.336 |
## where              | 0.702 0.681 0.055 |
## lunch              | 0.000 0.064 0.111 |
# visualize MCA
plot(mca_tea, invisible=c("ind"), habillage = "quali")

fviz_mca_biplot(mca_tea)